Meta’s Metaverse Division Restructured, AI Integration & Leadership Changes Drive Innovation


Meta Reportedly Reshapes Metaverse Division Amid Leadership Shifts, Putting AI at Its Core

Meta Refocuses on AI Integration

Meta is reportedly undergoing a significant restructuring of its Reality Labs and Metaverse divisions, deepening its commitment to integrating artificial intelligence across its product line, according to a report from Business Insider.

Leadership Changes in Meta’s AI and Metaverse Divisions

In a memo obtained by Business Insider, Meta Chief Technology Officer Andrew Bosworth announced that Vishal Shah, previously at the helm of Metaverse, will now lead AI Products within the newly established Meta Superintelligence Labs (MSL), a division dedicated to building “personal superintelligence” into Meta’s platforms. Shah, who has directed Reality Labs for the past four years, will manage AI integrations across the Family of Apps (FoA) and Reality Labs, reporting to MSL head Nat Friedman.

Continued Commitment to the Metaverse

Bosworth emphasized in his memo that metaverse development remains a high priority for the company, stating, “The priority of the metaverse work remains unchanged, and it continues to be a companywide priority.” He added that Meta has established a leading position in the industry, noting that competitors are racing to match its pace and that sustained effort will be needed to maintain that edge.

AI as the New Frontier of Metaverse Innovation

The memo highlighted Shah’s transition as pivotal for Meta’s strategic vision of merging metaverse advancements with the concept of “personal superintelligence”—an aspiration articulated by CEO Mark Zuckerberg in July 2025. Shah described his shift as both “difficult yet exciting,” acknowledging that the initial excitement around the metaverse has diminished. He suggested that AI represents a transformative force that will create more personalized, context-aware experiences that seamlessly connect the virtual and physical realms.

New Leadership Roles and Responsibilities

Gabriel Aul will assume Shah’s former responsibilities, leading the Metaverse Product Group, with Jason Rubin, Samantha Ryan, and Thamara Sekhar reporting to him. Aul will also oversee the Horizon Experiences team, previously led by Saxs Persson. Additionally, Ryan Cairns will continue directing Horizon OS, which now becomes an organization-level product group reporting directly to Bosworth. The structure and mission of Horizon OS remain unchanged: it will continue to concentrate on quality hardware and software development for the metaverse, especially with significant launches on the horizon.

Strategic Focus on Execution and Market Position

Bosworth reiterated that the metaverse remains a central focus, pointing out that while VR has expanded beyond entertainment into productivity and social interaction, the company must execute its strategy effectively. He noted that mobile platforms are increasingly appealing to younger gamers, and that developing AI creation tools is crucial to enhancing the overall user experience.

A Shift Towards AI-Driven Innovation

Despite Meta’s initial emphasis on the metaverse, there appears to be a strategic pivot toward prioritizing AI technologies within the company. Shah’s transition is a telling example: it places a longtime metaverse leader at the center of Meta’s AI push. While the company still sees the metaverse as essential for human interaction, the current focus appears to treat AI as a foundational element rather than a replacement for the metaverse vision.

The Future of Smart Glasses and AI Integration

Meta has recently shifted resources toward smart glasses, often referred to as “AI glasses.” The move comes as VR development has proven costly and slow to generate returns on the company’s substantial investments. Smart glasses’ potential appeal to a broader consumer base could give Meta an opportunity to showcase its capabilities in extended reality (XR).

AI’s Role in Enhancing User Experience

AI is expected to play a critical role in the future of augmented reality (AR) platforms, bridging the gap between user input and output and significantly enhancing the overall experience. For instance, the new Meta Ray-Ban Display glasses, priced at $800, use AI to support intuitive interactions, letting users get answers and help with everyday tasks without explicitly addressing an AI agent.

The Simplified Path to AI Integration

Interestingly, Meta has advanced its smart glasses initiative without the extensive groundwork typically required to launch a new platform. There has been no widespread developer outreach or significant investment in a library of applications, suggesting that a focus on AI and specific use cases has sufficed for now. This approach raises questions about the future of app development within Meta’s ecosystem, particularly as the company continues to explore the category without the traditional demand for a standout application.