OpenAI hits the ceiling in neural network development

    According to OpenAI CEO Sam Altman, the era of unveiling ever-larger AI models to the public is coming to an end. The breakneck pace of neural network development, which people have only just begun to get used to, is now expected to slow down significantly.

    Speaking at an MIT event, Altman suggested that further progress in artificial intelligence will not come from training ever-larger language models with hundreds of billions or trillions of parameters.

    “I think we are at the end of the era when such gigantic language models will be used. Going forward, we will improve them in other ways,” Altman said.

    Although the head of OpenAI did not emphasize this, there is a concrete reason why the growth in model size will have to stop. Training large language models certainly takes a great deal of time and human effort, but that is not the main constraint. More importantly, the marginal benefit of a larger model no longer justifies the exorbitant cost of training it.

    Since the high-performance compute needed to train language models is most often rented from third-party providers, the costs are staggering. OpenAI, for example, reportedly had to use some 10,000 Nvidia GPUs to train ChatGPT, and the sums spent on it ran into the hundreds of millions of dollars.
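
    To give a sense of scale, here is a minimal back-of-envelope sketch in Python. The GPU count comes from the article; the run duration and the hourly rental rate are purely illustrative assumptions, not OpenAI's actual figures.

    # Back-of-envelope estimate of the rental cost of one large training run.
    # Only num_gpus comes from the article; days and hourly_rate are assumptions.

    num_gpus = 10_000      # GPUs cited in the article
    days = 30              # assumed duration of a single training run
    hourly_rate = 2.50     # assumed per-GPU rental price in USD per hour

    gpu_hours = num_gpus * days * 24   # total GPU-hours consumed
    cost = gpu_hours * hourly_rate     # total rental cost in USD

    print(f"GPU-hours: {gpu_hours:,}")         # GPU-hours: 7,200,000
    print(f"Estimated cost: ${cost:,.0f}")     # Estimated cost: $18,000,000

    Even under these modest assumptions, a single run costs tens of millions of dollars; repeated experiments, failed runs, and longer training schedules are what push the total toward the hundreds of millions cited above.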

    Another factor that could slow down the development of AI is a shortage of that same compute. Some large companies prefer to buy GPUs rather than rent them, and since demand currently far exceeds supply, even someone like Elon Musk, who has reportedly already paid for 10,000 GPUs for the development of neural networks in his products, will have to wait. And the wait can stretch for months.

    “I think too much attention has been paid all this time to the parameter count of language models. It is very reminiscent of the gigahertz race in chips in the 1990s and 2000s,” Altman said.

    However, progress does not stand still. The performance of graphics chips keeps increasing, and the cost of buying or renting them will fall over time. Even if the head of OpenAI now says there is no need for larger language models, that does not mean such a need will not reappear in a few years. As Bill Gates supposedly once said, “640K ought to be enough for anybody.”

    Author: DeepWeb