Meta Connect Keynote 2024: The Biggest Reveals

From AI innovations to new hardware, the keynote was a fast-paced showcase of Meta’s ambitious vision for the future. Zuckerberg and his team introduced a slew of AI tools, including celebrity voices and real-time translation for Meta AI, as well as the new Quest 3S headset, a futuristic prototype for AR glasses, and updates for the Ray-Ban Meta Smart Glasses.

The Meta Quest 3S

Meta has officially introduced the Meta Quest 3S, a more affordable addition to its mixed-reality headset line-up. It now becomes the entry-level option for anyone wanting to explore mixed reality without breaking the bank.

The Quest 3S retains a similar design to the Quest 3, with its most notable difference being a redesigned front, particularly around the camera stack. Powered by the Snapdragon XR2 Gen 2 processor, the Quest 3S runs Horizon OS smoothly, effortlessly managing multitasking and gaming, including popular titles like Batman: Arkham Shadow.

The Quest 3 is equipped with pancake lenses and features continuous IPD adjustment, offering a resolution of 2,064 x 2,208 per eye, a pixel density of 1,218 PPI (pixels per inch), and 25 PPD (pixels per degree). It also provides an impressive field of view (FOV) of 110° horizontally and 96° vertically.

In contrast, the Quest 3S uses Fresnel lenses with a three-position IPD adjustment, delivering a lower resolution of 1,832 x 1,920 per eye, a pixel density of 773 PPI, and 20 PPD. Its FOV is slightly narrower, measuring 96° horizontally and 90° vertically. While the Quest 3S may not offer visuals as sharp and clear as the Quest 3, it is roughly £120 cheaper (at the time of writing), making it a far more budget-friendly option.

Check out our article on how the Apple Vision Pro stacks up against the Meta Quest 3.

Meta Quest for Business

Meta’s vision is revolutionising the way we work, particularly for businesses looking to integrate cutting-edge technology into their workflows. Meta provided an update on its close partnership with Microsoft to enhance remote desktop functionality. The new remote desktop integration will allow users to pair with any Windows 11 device simply by glancing at the keyboard.

Once paired, you can dive into an immersive work environment where the limitations of physical displays are a thing of the past, giving you as much space as you need to get the job done. Whether you prefer to spread your work across multiple displays or stick to one large, centralised monitor, the remote desktop functionality will cater to all types of workflows.

This is particularly beneficial for industries that demand multitasking, such as design, engineering, and financial services, where multiple windows and applications need to be open simultaneously.

Meta AI

Meta introduced Llama 3.2, the latest iteration of its large language model, now featuring multimodal capabilities. This significant upgrade allows the AI to process and interpret a broader range of inputs, including text, images, and even voice. By incorporating multimodality, Llama 3.2 is designed to deliver more intuitive and dynamic user interactions, making it a powerful tool for various applications from productivity tools to creative endeavours.

One of the standout features of Llama 3.2 is its ability to seamlessly switch between different types of inputs. For instance, users can ask questions through text and receive answers in both written and visual formats, or provide images that the AI can interpret and respond to with contextually accurate information. This level of flexibility opens up new possibilities for content creation, problem-solving, and collaboration, giving users more control over how they interact with and utilise the AI in their daily tasks.

In addition to its technical prowess, Meta emphasised the role of Llama 3.2 in enhancing accessibility and user experience across its platforms. The AI can now provide real-time translations, offer detailed image descriptions, and even engage in voice-activated interactions, making it a versatile tool for businesses and individuals alike.

Check out our article comparing Llama 3.2 and GPT-4.

Major Updates for Ray-Ban Meta Smart Glasses (No New Model Yet)

While Meta didn’t unveil a new model of the Ray-Ban Meta Smart Glasses, there were several substantial updates. These smart glasses will soon offer hands-free access to music, audiobooks, and podcasts via services like Spotify and Audible. Even better, Meta AI enables users to request music by genre or artist, moving toward more natural, conversational AI interactions.

Beyond entertainment, the smart glasses are set to handle language translation by the end of the year. Imagine looking at a sign in Spanish, French, or Italian and asking Meta AI to translate it on the spot; it worked impressively well during the demos. Additional updates include faster transitions into sunglass mode and the integration of Meta AI for video. All these enhancements will roll out before 2025.

Meta AI Translation for Instagram Reels

Meta is bringing a game-changing AI translation tool to Instagram and Facebook Reels, allowing content creators to connect with a broader audience. This tool automatically dubs and lip-syncs videos in different languages, making content accessible across language barriers.

Currently being tested in the US and Latin America, the feature could revolutionise how people consume video content, as it allows users to watch Reels from around the world in their own language without the need for subtitles. Meta hopes to expand this tool to support more languages in the near future.

A First Look at the Prototype Orion AR Glasses

The biggest surprise of the keynote came when Zuckerberg unveiled Meta’s first fully holographic AR glasses, dubbed Orion. These sleek, futuristic glasses resemble traditional eyewear like the Ray-Ban Meta Smart Glasses, though they’re slightly bulkier to accommodate all the advanced tech inside. Orion features a brand-new display architecture, offering a fully transparent view of the world around you, with content overlaid seamlessly onto your surroundings.

What’s even more exciting is the innovative control system—users can interact with the glasses via hand gestures, made possible by a neural-network-powered wristband. In the demos shown, the Orion glasses were used for gaming and displaying virtual windows that can appear in your line of sight. There was even a snippet of video calling with avatars, highlighting the potential for personalisation in virtual communications.

The Orion prototype impressed with its lightness, thanks to a magnesium frame, and instead of traditional glass lenses it uses silicon carbide lenses to project light for high-quality displays. While still a few years away from consumer release, Meta announced plans to offer early developer kits, though it will need to bring costs down before Orion becomes a mainstream product.

The Future of Meta’s Vision

Meta Connect 2024 was a showcase of how the company is advancing on all fronts: AR, VR, and AI. From the groundbreaking Orion AR glasses to the budget-friendly Quest 3S, Meta is positioning itself at the forefront of next-generation technology. With AI becoming more integrated into our daily lives, the future Zuckerberg and his team are building is one where the boundaries between physical and virtual worlds continue to blur.