
Wednesday, 29 March 2023

Gadgets 360

Artificial intelligence chip startup Cerebras Systems on Tuesday said it released open source ChatGPT-like models for the research and business community to use for free in an effort to foster more collaboration.

Silicon Valley-based Cerebras released seven models, all trained on its AI supercomputer called Andromeda, ranging from a smaller 111-million-parameter language model to a larger 13-billion-parameter model.

"There is a big movement to close what has been open-sourced in AI...it's not surprising as there's now huge money in it," said Andrew Feldman, founder, and CEO of Cerebras. "The excitement in the community, the progress we've made, has been in large part because it's been so open."

Models with more parameters are able to perform more complex generative functions.

OpenAI's chatbot ChatGPT, launched late last year, for example, has 175 billion parameters and can produce poetry and research, which has helped draw large interest and funding to AI more broadly.

Cerebras said the smaller models can be deployed on phones or smart speakers, while the bigger ones run on PCs or servers, although complex tasks like summarizing large passages require larger models.

However, Karl Freund, a chip consultant at Cambrian AI, said bigger is not always better.

"There's been some interesting papers published that show that (a smaller model) can be accurate if you train it more," said Freund. "So there's a trade off between bigger and better trained."

Feldman said his biggest model took a little over a week to train, work that typically takes several months, thanks to the architecture of the Cerebras system, which includes a chip the size of a dinner plate built for AI training.

Most of the AI models today are trained on Nvidia's chips, but more and more startups like Cerebras are trying to take share in that market.

The models trained on Cerebras machines can also be used on Nvidia systems for further training or customization, said Feldman.

© Thomson Reuters 2023


From smartphones with rollable displays or liquid cooling, to compact AR glasses and handsets that can be repaired easily by their owners, we discuss the best devices we've seen at MWC 2023 on Orbital, the Gadgets 360 podcast. Orbital is available on Spotify, Gaana, JioSaavn, Google Podcasts, Apple Podcasts, Amazon Music and wherever you get your podcasts.

