
AWS in April 2024: New functions and applications

    As it does every month, AWS added new functions and introduced new applications in April 2024. Amazon Web Services is updated regularly to keep it current. We have put together a small selection of the most important developments; this month they include new resource types for AWS Config and the availability of Llama 3.

    AWS in April 2024 brings version 2024.04 for Research and Engineering Studio

    Research and Engineering Studio (RES) on AWS was updated to version 2024.04 in April 2024. The new release expands the options for shared storage: with added support for Amazon FSx for Lustre, there is now an additional shared file system to choose from.

    The most important change concerns the Virtual Desktop Infrastructure (VDI), which lets users create and stream virtual desktops. Since version 2024.04, these VDI sessions can be launched from custom Amazon Machine Images (AMIs). Provisioning software in the AMI ahead of time shortens VDI boot times, and internet access during provisioning is no longer required.

    Other user-defined settings include launch scripts, security groups, and instance profiles.
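    As a rough sketch of how a custom AMI might be picked up for such a VDI software stack: the boto3 lookup below retrieves the ID of the most recent matching image, which can then be registered in the RES administration interface. The region, owner ID, and name pattern are placeholders, not values from the announcement.

```python
# Minimal sketch: find a custom VDI AMI to register as a RES software stack.
# Region, owner ID, and name pattern are placeholders; adjust them to your account.
import boto3

ec2 = boto3.client("ec2", region_name="eu-central-1")

response = ec2.describe_images(
    Owners=["123456789012"],  # placeholder account ID
    Filters=[{"Name": "name", "Values": ["res-vdi-*"]}],  # placeholder name pattern
)

# Pick the most recently created image.
images = sorted(response["Images"], key=lambda img: img["CreationDate"], reverse=True)
if images:
    print("AMI to register in RES:", images[0]["ImageId"])
```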

    New instance types and new resource types in AWS in April 2024

    Two new, more powerful instance types have been announced for Amazon WorkSpaces Web. They increase performance for complex workloads, which is particularly noticeable when streaming audio and video and when processing large files. With the new additions, Amazon WorkSpaces Web now supports a total of three instance types: standard.regular, standard.large, and standard.xlarge.

    standard.regular is sufficient for simple websites with static text and images, while standard.large has twice as much memory and therefore delivers more performance. standard.xlarge doubles both the virtual computing power and the RAM of standard.large and is best suited for audio and video streaming.

    AWS Config has also received new resource types: a total of 35 new types are now available for advanced queries. These queries can be used to search the configuration data of AWS resources and determine their current state.
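    As a small illustration of such an advanced query (a sketch; the SQL-like expression and the resource type are generic examples, not one of the 35 newly added types), the current configuration state of resources can be queried via boto3:

```python
# Minimal sketch of an AWS Config advanced query via boto3.
# The expression and resource type are generic examples, not one of the new types.
import boto3

config = boto3.client("config", region_name="eu-central-1")  # region is an example

expression = (
    "SELECT resourceId, resourceType, availabilityZone "
    "WHERE resourceType = 'AWS::EC2::Instance'"
)

response = config.select_resource_config(Expression=expression, Limit=25)
for result in response["Results"]:
    print(result)  # each result is a JSON string describing one resource
```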

    Llama 3 in AWS in April 2024

    Llama 3, Meta's large language model, is now available on AWS in its base version. Specifically, Llama 3 can be used via Amazon SageMaker JumpStart.

    Llama 3 has been available on AWS since April 2024 in two parameter sizes: 8 billion and 70 billion. Its areas of application are, in particular, reasoning and code generation. Thanks to its decoder-only transformer architecture, the language model offers improved model performance.

    Meta has equipped Llama 3 with a number of improvements. In an AWS environment, Llama 3 can be operated and monitored with SageMaker features such as SageMaker Pipelines, SageMaker Debugger, and container logs.
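    A minimal deployment sketch with the SageMaker Python SDK is shown below. The model ID and payload format are assumptions based on the usual JumpStart workflow for text-generation models; check the JumpStart catalog for the exact values.

```python
# Minimal sketch: deploy Llama 3 8B from SageMaker JumpStart and send one prompt.
# The model ID and payload format are assumptions; verify them in the JumpStart catalog.
from sagemaker.jumpstart.model import JumpStartModel

# When running outside SageMaker Studio, pass an execution role explicitly, e.g.
# JumpStartModel(model_id=..., role="arn:aws:iam::<account-id>:role/<role-name>").
model = JumpStartModel(model_id="meta-textgeneration-llama-3-8b")  # assumed model ID

# Deploying Meta models requires accepting the end-user license agreement.
predictor = model.deploy(accept_eula=True)

response = predictor.predict({
    "inputs": "Write a Python function that reverses a string.",
    "parameters": {"max_new_tokens": 256, "temperature": 0.2},
})
print(response)

# Delete the endpoint again to avoid ongoing costs.
predictor.delete_endpoint()
```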

    Learn AWS Machine Learning with skill it

    In skill it seminars, you will learn machine learning on AWS or sharpen your existing AI skills.

    In our four-day course The Machine Learning Pipeline on AWS, you will learn how to solve real business problems with machine learning. The course prepares you for the 'AWS Certified Machine Learning (Specialty Level)' certification.

    With Amazon SageMaker Studio for Data Scientists, advanced users of AWS machine learning expand their skills through practical exercises.

    In the training MLOps Engineering on AWS, you will learn how to combine machine learning with DevOps. The training lasts three days and lets you get hands-on experience with numerous practical applications.

    Author
    Marcel Michaelsen
    Marcel writes IT content for websites as a freelancer at Textflamme. The topics range from product descriptions to complex technical articles.