
Olympus: Amazon develops its own large language model

    Amazon's Large Language Model (LLM), dubbed "Olympus", is another move by the corporation to compete with ChatGPT. A few weeks ago, Amazon invested several billion US dollars in the AI developer Anthropic in order to use its ChatGPT competitor "Claude 2" exclusively for AWS applications. The race for supremacy in artificial intelligence remains exciting.

    What is Olympus?

    Amazon Olympus is a Large Language Model (LLM). Like ChatGPT, Olympus will answer questions and generate text. However, the market for artificial intelligence is growing rapidly, and Amazon will need clear advantages to compete with the already established players.

    Not that AWS has much of a choice. Microsoft has secured access to the currently best-known LLM, ChatGPT, by investing in its developer OpenAI. Google relies on PaLM 2, a large language model developed in-house. To keep pace with these competitors, Amazon effectively has to build its own model.

    This is where Olympus comes into play; it will logically be available exclusively to Amazon and its customers. The assumption is that Olympus will be integrated into Amazon Web Services (AWS) and support most, if not all, applications there.

    But what can Olympus do that other large language models can't?

    How Olympus wants to set itself apart from the competition

    The current top dog on the market is clearly GPT-4 from OpenAI, which is reported to have around 1.76 trillion parameters. By comparison, its predecessor GPT-3 had a "mere" 175 billion parameters - still an enormous number, corresponding to a training time equivalent to roughly 355 years on a single GPU.

    Amazon Olympus is naturally meant to surpass this and is reportedly being trained with 2 trillion parameters, which would make it the largest LLM to date. The advantages are obvious: a larger model trained on more data can produce more detailed and more natural-sounding answers, and in theory incorrect information should at least become rarer.

    The team developing and training Olympus already has experience: Amazon has previously trained smaller LLMs such as its generative AI "Titan", which is also integrated into AWS. The collaboration with the AI developers Anthropic and AI21 Labs should also help in building an LLM that can hold its own against the competition. However, it is not yet known when Olympus will be released; according to Amazon, no fixed time frame has been set.
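    Titan shows what such an integration looks like in practice today: AWS customers can call the model through the Amazon Bedrock runtime API. The following Python snippet is a minimal sketch of such a call with boto3; the region, the model ID ("amazon.titan-text-express-v1") and the request and response schema are assumptions and may differ depending on your account and the current API version.

    import json
    import boto3

    # Bedrock runtime client; the region is an assumption - pick one where Bedrock is available.
    client = boto3.client("bedrock-runtime", region_name="us-east-1")

    # Request body in the (assumed) Titan text format.
    request_body = {
        "inputText": "Explain in two sentences what a large language model is.",
        "textGenerationConfig": {"maxTokenCount": 200, "temperature": 0.5},
    }

    response = client.invoke_model(
        modelId="amazon.titan-text-express-v1",  # assumed Titan text model ID
        body=json.dumps(request_body),
        contentType="application/json",
        accept="application/json",
    )

    # The response body is a JSON stream; the result structure shown here is also an assumption.
    result = json.loads(response["body"].read())
    print(result["results"][0]["outputText"])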

    We will probably hear a lot more about Olympus in 2024 and may even see a release. Until then, we will continue to use the applications in AWS as they are currently available.

    Using applications in AWS professionally with skill it

    AI is already an important topic when working with Amazon Web Services. In our AWS Technical Essentials training, you will learn the most important basics for working with applications in AWS. Do you already have experience with AWS and want to expand your skills? Then our three-day Advanced Architecting on AWS course is ideal for getting to know new applications and possibilities. This course is also available as a one-day version with a small competition among the participants: the Advanced Architecting on AWS - JAM Day.

    Author
    Marcel Michaelsen
    Marcel writes IT content for websites as a freelancer at Textflamme. The topics range from product descriptions to complex technical articles.