Low Power High Value
Low power high value
Since the invention of the steam engine, nations have moved geopolitical mountains to secure access to energy, and today that theme is more pronounced than ever. Energy use is now woven into every fabric of our society, and we increasingly scrutinize its effectiveness and efficiency per application, because there is always a cost.
This cost must be weighed against the value of the output. Whether it is air conditioning, electric vehicles, next-generation nuclear power plants, building insulation, data centers, deep neural networks operated by financial institutions, or LLMs, the equation between energy used (power) and value produced must make sense.
The Cost of Decision
When we started building our development framework back in 2017, we decided that every model, algorithm, and design implemented by Deepmoney must factor in the Cost of Decision.
The human brain, with 86 billion neurons and 100 trillion connections, has enormous processing power, yet it draws only about 20 watts, roughly the same as a small light bulb. By some estimates, running ChatGPT consumes as much energy per day as about 30,000 US households.
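The scale of that gap can be made concrete with some back-of-envelope arithmetic. The per-household consumption figure below is an assumed average, not a number from this article:

```python
# Back-of-envelope daily energy comparison (illustrative figures only).
BRAIN_WATTS = 20                   # continuous draw of the human brain
HOUSEHOLD_KWH_PER_DAY = 29         # assumed average US household (~10,600 kWh/yr)
NUM_HOUSEHOLDS = 30_000            # household-equivalent estimate cited above

brain_kwh_per_day = BRAIN_WATTS * 24 / 1000               # 20 W for 24 h
llm_kwh_per_day = HOUSEHOLD_KWH_PER_DAY * NUM_HOUSEHOLDS  # 30,000 households

ratio = llm_kwh_per_day / brain_kwh_per_day
print(f"Brain:  {brain_kwh_per_day} kWh/day")       # 0.48 kWh/day
print(f"LLM:    {llm_kwh_per_day:,.0f} kWh/day")    # 870,000 kWh/day
print(f"Ratio:  ~{ratio:,.0f}x")
```

Under these assumed figures, the gap works out to roughly a million-fold difference in daily energy use.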
Through evolution, the human brain has learned to optimize energy and resource management: generalizing across multiple sensory inputs optimizes memory storage and search, while causal reasoning optimizes learning through action, effect, and positive or negative reinforcement. This learning-and-retraining feedback loop lets us operate with sparse parameters.
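The action-effect-reinforcement loop described above can be sketched in miniature. This toy running-estimate update is purely illustrative and is not Deepmoney's actual learning rule:

```python
# Minimal action -> effect -> reinforcement loop: a running value estimate
# nudged toward each observed outcome (a one-step temporal-difference update).
def reinforce(estimate: float, reward: float, lr: float = 0.1) -> float:
    """Move the estimate a small step toward the observed reward."""
    return estimate + lr * (reward - estimate)

estimate = 0.0
for reward in [1.0, 1.0, -1.0, 1.0]:  # +1 positive, -1 negative outcomes
    estimate = reinforce(estimate, reward)
print(round(estimate, 4))
```

Each pass through the loop is an action whose outcome feeds back into the next decision, which is the feedback structure the paragraph attributes to the brain.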
The fact is that LLM design runs contrary to the brain's natural design: although an LLM tries to imitate the brain's function, it does so through a different architecture, with energy and power requirements equivalent to tens of thousands of households rather than a single light bulb.
Even many quantitative-finance deep neural network models require millions of parameters, months of training, massive computing resources, and ultimately significant costs in resources and energy.
By contrast, training a Deepmoney model for a single cryptocurrency can take about 30 minutes and produces a file of roughly 20 MB. The resulting model maintains a complete live understanding of the market, with a response time of about 0.3 seconds.
In Deepmoney models, the Cost of Decision is a key metric that we continuously optimize to keep low. It encompasses data, decision, learning, and action management.
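As a hypothetical illustration of how a metric spanning those four components might be structured — the field names, units, and example values below are our assumptions, not Deepmoney's actual formulation:

```python
from dataclasses import dataclass

@dataclass
class DecisionCost:
    """Hypothetical per-decision cost, split into the four areas named above."""
    data_joules: float      # energy to ingest and store the inputs
    decision_joules: float  # energy to evaluate the model and decide
    learning_joules: float  # amortized energy of (re)training
    action_joules: float    # energy to execute and monitor the action

    def total(self) -> float:
        return (self.data_joules + self.decision_joules
                + self.learning_joules + self.action_joules)

    def value_per_joule(self, expected_value: float) -> float:
        """Value produced per unit of energy spent: the quantity to maximize."""
        return expected_value / self.total()

cost = DecisionCost(0.5, 2.0, 1.0, 0.5)
print(cost.total())              # 4.0
print(cost.value_per_joule(40))  # 10.0
```

Framing the metric as value per joule makes "low power, high value" directly optimizable: either term of the ratio can be improved independently.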
Our thesis targets this type of AI framework, as our deployed systems show. In one crypto-market use case, models for 30 cryptocurrencies encapsulated in Deepmoney run on machines with 2x16 GB of RAM and a Core i7 processor, with an optional basic GPU.
Just as the brain optimizes food consumption, converting it into energy to survive longer, AI models need to optimize energy consumption at a reasonable cost.
Most AI models are built on the idea of Big Data: consume and process ever more data. But this fundamental idea, shared by quantitative deep-neural-network hedge funds and LLMs alike, is unnatural compared with the architecture of the human brain.
Deepmoney is confident that the future lies in Small or Micro Multi-Modal Models integrated with causal AI, completing the functions of knowledge and experience.
"The best ideas in science are always simple, elegant, and unexpected."
©2024 Deepmoney · All rights reserved.