OpenAI's viral 53-page PDF document: a plan to develop AGI by 2027 (English)

Revealing OpenAI's plan to create AGI by 2027

In this document I will be revealing information I have gathered regarding OpenAI's (delayed) plans to create human-level AGI by 2027. Not all of it will be easily verifiable, but hopefully there's enough evidence to convince you.

Summary:
- OpenAI started training a 125 trillion parameter multimodal model in August of 2022. The first stage was Arrakis, also called Q*. The model finished training in December of 2023, but the launch was canceled due to high inference cost. This is the original GPT-5, which was planned for release in 2025. Gobi (GPT-4.5) has been renamed to GPT-5 because the original GPT-5 has been canceled.
- The next stage of Q*, originally GPT-6 but since renamed to GPT-7 (originally for release in 2026), has been put on hold because of the recent lawsuit by Elon Musk.
- Q* 2025 (GPT-8) was planned to be released in 2027, achieving full AGI.
- Q* 2023 = 48 IQ
- Q* 2024 = 96 IQ (delayed)
- Q* 2025 = 145 IQ (delayed)
- Elon Musk caused the delay because of his lawsuit. This is why I'm revealing the information now: no further harm can be done.

I've seen many definitions of AGI (artificial general intelligence), but I will define AGI simply as an artificial intelligence that can do any intellectual task a smart human can. This is how most people define the term now.

2020 was the first time I was shocked by an AI system: GPT-3. GPT-3.5, an upgraded version of GPT-3, is the model behind ChatGPT. When ChatGPT was released, I felt as though the wider world was finally catching up to something I had been interacting with two years prior. I used GPT-3 extensively in 2020 and was shocked by its ability to reason. GPT-3, and its half-step successor GPT-3.5 (which powered the now famous ChatGPT before it was upgraded to GPT-4 in March 2023), were a massive step towards AGI in a way that earlier models weren't.
The thing to note is that earlier language models like GPT-2 (and basically all chatbots since ELIZA) had no real ability to respond coherently at all. So why was GPT-3 such a massive leap?

Parameter Count

"Deep learning" is a concept that essentially goes back to the beginning of AI research in the 1950s. The first neural network was created in the 50s, and modern neural networks are just "deeper", meaning they contain more layers: they're much, much bigger and trained on lots more data. Most of the major techniques used in AI today are rooted in basic 1950s research, combined with a few minor engineering solutions like "backpropagation" and "transformer models". The overall point is that AI research hasn't fundamentally changed in 70 years. So there are only two real reasons for the recent explosion of AI capabilities: size and data. A growing number of people in the field are beginning to believe we've had the technical details of AGI solved for many decades, but merely didn't have enough computing power and data to build it until the 21st century. Obviously, 21st century computers are vastly more
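To make the "size" lever concrete, here is a minimal sketch (my illustration, not from the document) of how parameter count grows in a plain fully connected network as you make it deeper and wider. The layer sizes below are arbitrary example values, chosen only to show the scaling.

```python
def mlp_param_count(layer_sizes):
    """Total parameters (weights plus biases) of a fully connected network.

    Each consecutive pair of layers (n_in -> n_out) contributes an
    n_in x n_out weight matrix plus n_out bias terms.
    """
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

# A small "shallow" network versus the same idea made deeper and wider.
shallow = mlp_param_count([784, 32, 10])             # 25,450 parameters
deep = mlp_param_count([784, 512, 512, 512, 10])     # 932,362 parameters

print(shallow, deep)
```

The architecture is unchanged between the two; only width and depth differ, and the parameter count grows by more than an order of magnitude. Scaled-up language models apply this same principle (with transformer layers rather than plain dense layers) at vastly larger sizes.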

Information Technology · 2024-03-13 · OPENAI · 53 pages · 4.6M
