OpenAI hit the ceiling in the development of neural networks

According to OpenAI CEO Sam Altman, the practice of regularly unveiling ever larger AI models to the public is coming to an end. The breakneck pace of neural network development that people have only just begun to get used to is now expected to slow down considerably.

Speaking at an MIT event, Altman suggested that further progress in artificial intelligence and neural networks will not come from giant language models with trillions of parameters.

“I think we are at the end of the era of these giant language models. Going forward, we will improve them in other ways,” Altman said.

Although the head of OpenAI did not dwell on it, there is a concrete reason why the growth in model size will have to stop. Training large language models certainly takes a great deal of time and human effort, but that is not the main constraint. More importantly, the benefits of ever larger models no longer justify the exorbitant cost of training them.

Since the high-performance compute needed to train language models is usually rented from outside providers, the costs are enormous. OpenAI, for example, is reported to have used around 10,000 Nvidia GPUs to train ChatGPT, at a cost running into hundreds of millions of dollars.
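
To see why those numbers climb so quickly, here is a minimal back-of-envelope sketch of training cost as a function of model and dataset size. It relies on the commonly cited approximation of roughly 6 FLOPs per parameter per training token; the GPU throughput, utilization and hourly rental price are illustrative assumptions rather than figures from the article, and the result covers only raw GPU rental for a single run, not experiments, failed runs, data or staff.

# Back-of-envelope LLM training cost. All constants are illustrative
# assumptions: ~6 * params * tokens total training FLOPs, an A100-class
# GPU at 312 TFLOP/s peak, 30% realized utilization, $2 per GPU-hour.

def training_cost_usd(params: float, tokens: float,
                      gpu_tflops: float = 312.0,
                      utilization: float = 0.3,
                      usd_per_gpu_hour: float = 2.0) -> float:
    total_flops = 6 * params * tokens                    # rough compute estimate
    flops_per_gpu_second = gpu_tflops * 1e12 * utilization
    gpu_hours = total_flops / flops_per_gpu_second / 3600
    return gpu_hours * usd_per_gpu_hour

# Example: a hypothetical 175B-parameter model trained on 300B tokens.
print(f"~${training_cost_usd(175e9, 300e9):,.0f} in raw GPU rental")

Under these assumptions a single run of that size costs on the order of a couple of million dollars in raw rental alone; scaling parameters and tokens up by a factor of ten each multiplies the bill roughly a hundredfold, which is the dynamic Altman is pointing to.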

Another factor that could slow AI development is a shortage of that very compute capacity. Some large companies prefer to buy GPUs outright rather than rent them, and since demand currently far exceeds supply, even someone like Elon Musk, who has already paid for 10,000 GPUs to build neural networks into his own products, has to wait. And that wait can stretch into months.

“I think far too much attention has been paid all this time to the parameter counts of language models. It is very reminiscent of the gigahertz race in chips in the 1990s and 2000s,” Altman said.

Progress, however, does not stand still. GPU performance keeps increasing, and the cost of buying or renting chips will fall over time. Even if the head of OpenAI says today that there is no need for larger language models, it does not mean that such a need will not emerge again in a few years. As Bill Gates is (perhaps apocryphally) said to have claimed, "640 KB of memory ought to be enough for anybody."

Author DeepWeb