
This Week in Generative AI – 3rd to 9th November

Published November 10, 2023; updated January 8, 2024

OpenAI unveils custom GPTs, GPT Store, and Assistants API at DevDay

  • OpenAI held its first-ever developer conference, DevDay, where it unveiled custom versions of ChatGPT called GPTs, a GPT Store for distributing them, and an Assistants API for developers.
  • Microsoft CEO Satya Nadella was a guest speaker, praising the partnership with OpenAI.
  • Other announcements included Copyright Shield, a fine-tuning program for GPT-4, and a refreshed ChatGPT user interface.

Read the Article

Amazon training large language model codenamed ‘Olympus’ to rival OpenAI, Microsoft

  • Amazon is reportedly developing a large language model (LLM) codenamed ‘Olympus’ with 2 trillion parameters, potentially rivaling models from OpenAI, Microsoft, and Google.
  • Olympus is expected to enhance Amazon’s e-commerce platform, Alexa voice assistant, and AWS services.
  • The model could be announced as early as December.

Read the Article

Apple is getting “serious” about generative AI, CEO Tim Cook hints during earnings call

  • Apple CEO Tim Cook has confirmed that the company is investing heavily in generative AI, following the success of OpenAI’s ChatGPT.
  • While details are limited, Cook stated that Apple is working responsibly on the technology, with product advancements expected.
  • The company’s research budget has reportedly increased to $23 billion annually.

Read the Article

Samsung unveils Gauss – its own generative AI model to rival ChatGPT

  • Samsung has unveiled Gauss, a generative AI model designed to rival ChatGPT.
  • The model, which includes Gauss Language, Gauss Code, and Gauss Image, is currently used internally to enhance workflows.
  • Samsung plans to incorporate Gauss into consumer products, offering features such as text generation, AI coding assistance, and AI image generation.

Read the Article

NVIDIA’s Eos supercomputer just broke its own AI training benchmark record

  • NVIDIA’s Eos supercomputer, powered by over 10,000 H100 Tensor Core GPUs, has broken its own AI training benchmark record.
  • It can now complete an MLPerf training benchmark based on the 175-billion-parameter GPT-3 model in under four minutes, nearly three times faster than its previous record.
  • The supercomputer’s improved performance is attributed to software optimizations and scaling the workload across more GPUs.

Read the Article