{"id":2720819,"date":"2023-06-13T02:25:50","date_gmt":"2023-06-13T06:25:50","guid":{"rendered":"https:\/\/wordpress-1016567-4521551.cloudwaysapps.com\/plato-data\/zuckerberg-says-metas-metaverse-more-inclusive-than-apples\/"},"modified":"2023-06-13T02:25:50","modified_gmt":"2023-06-13T06:25:50","slug":"zuckerberg-says-metas-metaverse-more-inclusive-than-apples","status":"publish","type":"station","link":"https:\/\/platodata.io\/plato-data\/zuckerberg-says-metas-metaverse-more-inclusive-than-apples\/","title":{"rendered":"Zuckerberg Says Meta’s Metaverse More Inclusive Than Apple’s"},"content":{"rendered":"
<\/div>\n

OpenAI, the Microsoft-backed startup, will not train the successor to GPT-4 \u201cfor some time\u201d because of concerns raised about the speed of AI advancement.<\/strong><\/p>\n

The American startup launched ChatGPT, powered by the GPT-3.5 large language model, last November.<\/p>\n

Iterative improvement<\/h2>\n

OpenAI launched GPT-4, one of the most anticipated LLM releases, in March. GPT-4 is the latest and most advanced version of the firm\u2019s large language models<\/a>, which are incorporated into ChatGPT and several other applications. Already, however, fans and commentators are looking ahead to GPT-5.<\/p>\n

Just last week<\/a>, at a conference held by India\u2019s Economic Times<\/em>, OpenAI CEO Sam Altman was asked when the company intends to start training its next LLM.<\/p>\n

\n

\u201cWe have a lot of work to do before we start that model. We\u2019re working on the new ideas that we think we need for it, but we are certainly not close to it to start,\u201d said Altman.<\/p>\n<\/blockquote>\n

Following the launch of GPT-4 and other relevant developments in the generative AI era, more than 1,000 tech leaders wrote an open letter calling for a pause on all major AI development<\/a> and training until developers can better understand how these technologies function.<\/p>\n

Elon Musk, Apple co-founder Steve Wozniak, and other notable figures signed the letter, which received over 1,377 signatures.<\/p>\n

The letter also included signatures from Turing Award winner Yoshua Bengio, professor of computer science Stuart Russell, Stability AI CEO Emad Mostaque, Getty Images CEO Craig Peters, and several other tech executives and prominent scientists.<\/p>\n

A few weeks later, Altman said that the letter was \u201cmissing most technical nuance about where we need the pause,\u201d but confirmed that OpenAI had not started training GPT-5 and had no plans to do so for \u201csome time.\u201d<\/p>\n

\n

Interestingly, @OpenAI<\/a> are still not training #GPT5<\/a>, the successor to #GPT4<\/a> that lies behind #chatGPT<\/a> due to concerns about the dangers posed by the misuse of large language models (#LLM<\/a>|s). More details here: https:\/\/t.co\/sFYZ353lIo<\/a><\/p>\n

\u2014 Oliver Lord (@olivertlord) June 9, 2023<\/a><\/p>\n<\/blockquote>\n

Altman has once again addressed the concerns raised by prominent voices in the field of AI, indicating that the startup has taken proactive steps to mitigate potential risks by implementing rigorous safety measures.<\/p>\n

\n

\u201cWhen we finished GPT-4, it took us more than six months until we were ready to release it,\u201d said Altman.<\/p>\n<\/blockquote>\n

Altman also emphasized in the interview that OpenAI opposes regulation of smaller AI startups.<\/p>\n

\u201cThe only regulation we have called for is on ourselves and people bigger,\u201d argued the CEO.<\/p>\n

Read Also: White House Takes Steps to Study AI Risks, Assess Impact on Workers<\/em><\/a><\/strong><\/p>\n

Waiting for better GPUs?<\/h2>\n

Speculation on Twitter suggests that OpenAI may be waiting for better hardware to train the next generation of powerful GPT models.<\/p>\n

\u201cWaiting for GPUs and better data from GPT-4 interactions,\u201d tweeted<\/a> Bert Kastel in reaction to the news regarding OpenAI\u2019s delay.<\/p>\n

\u201cNot enough Hardware for GPT-4 even,\u201d argued Christoph C. Cemper, Chief Prompt Officer at AIPRM.<\/p>\n

\n

not enough Hardware for GPT-4 even<\/p>\n

\u2014 Christoph C. Cemper \ud83c\uddfa\ud83c\udde6 \ud83e\udde1 SEO&AI (@cemper) June 8, 2023<\/a><\/p>\n<\/blockquote>\n

Oliver Lord, Head of Petrology in the University of Bristol\u2019s Earth Sciences department, has noted<\/a> that OpenAI has not yet begun training GPT-5, the successor to GPT-4, the model powering ChatGPT.<\/p>\n

The reason behind this decision, Lord argued, is concern about the potential dangers of LLM misuse.<\/p>\n


Microsoft to offer GPT models to government<\/h2>\n

Tech giant Microsoft is bringing the language-producing models from OpenAI to US federal agencies using its Azure cloud service, according<\/a> to Reuters.<\/p>\n

The company has extended its support within Azure Government to include OpenAI\u2019s advanced LLMs, among them the latest and highly sophisticated GPT-4, along with GPT-3.<\/p>\n

This is the first time Microsoft has brought GPT technology to Azure Government, which offers cloud solutions to US government agencies, and the first such effort by a major company to make chatbot technology available to governments.<\/p>\n