Meta (Facebook)
Canvas Category: Software : Engineering : Metaverse
When Facebook launched in 2004, it changed the way people connect. Apps like Messenger, Instagram and WhatsApp further empowered billions around the world. Now, Meta is moving beyond 2D screens toward immersive experiences like augmented and virtual reality to help build the next evolution in social technology.
Assembly Line
Textbooks come alive with new, interactive AI tool
With just an iPad, students in any classroom across the world could soon reimagine the ordinary diagrams in any physics textbook—transforming these static images into 3D simulations that run, leap or spin across the page. These new, living textbooks are the brainchild of a team of computer scientists led by Ryo Suzuki at CU Boulder.
“Usually, those diagrams are fixed. We have to imagine what happens,” said Suzuki, assistant professor in the ATLAS Institute and Department of Computer Science. “But what if we could take any static diagram from any textbook and make it interactive?”
He and his colleagues recently took home a “best paper” award for their work at the 37th Annual ACM Symposium on User Interface Software and Technology this October in Pittsburgh.
The tool relies on a model called Segment Anything from the tech company Meta. It's a computer vision model that allows users to click on a photo to isolate particular objects: a dog, or maybe a face. Similarly, through Augmented Physics, students and teachers select various objects inside a diagram, such as the skier and the ski jump, and assign those objects roles. The AI then applies some basic physics, such as the force of gravity, to make those objects move.
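As a rough illustration of that pipeline (a sketch under stated assumptions, not the authors' code), the snippet below uses Meta's publicly released Segment Anything model to lift an object out of a diagram from a single click and then animates the resulting sprite's centroid under a toy gravity model; the checkpoint path, click coordinates, and physics constants are placeholders.

```python
# Sketch: click-to-segment an object with Segment Anything, then apply toy gravity.
# The checkpoint path, image file, click point, and constants are assumptions.
import numpy as np
import cv2
from segment_anything import sam_model_registry, SamPredictor

sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b.pth")  # pretrained SAM weights
predictor = SamPredictor(sam)

diagram = cv2.cvtColor(cv2.imread("ski_jump_diagram.png"), cv2.COLOR_BGR2RGB)
predictor.set_image(diagram)

# One foreground click on the skier; label 1 marks the point as "on the object".
masks, scores, _ = predictor.predict(
    point_coords=np.array([[420, 310]]),
    point_labels=np.array([1]),
    multimask_output=False,
)
skier_mask = masks[0]  # boolean mask of the selected object

# Toy physics: treat the mask's centroid as a particle and integrate gravity.
ys, xs = np.nonzero(skier_mask)
pos = np.array([xs.mean(), ys.mean()])   # pixel coordinates (x, y)
vel = np.array([30.0, -40.0])            # initial push: rightward and up
g = np.array([0.0, 196.0])               # gravity in pixel units (y grows downward)
dt = 1.0 / 60.0
for _ in range(120):                     # two seconds of 60 Hz animation steps
    vel = vel + g * dt
    pos = pos + vel * dt
    # a real renderer would redraw the masked sprite at `pos` each frame
```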
Simulator-based reinforcement learning for data center cooling optimization
Most of Meta's existing data centers use outdoor air and evaporative cooling systems to keep environmental conditions within an envelope of 65°F to 85°F (18°C to 30°C) and 13% to 80% relative humidity. Because water and energy are consumed in conditioning this air, minimizing the amount of supply airflow that has to be conditioned is a high priority for improving operational efficiency.
Since 2021, we have been leveraging AI to optimize the amount of airflow supplied to data centers for cooling. Using simulator-based reinforcement learning, we have reduced supply fan energy consumption in one of the pilot regions by an average of 20% and water usage by 4% across various weather conditions.
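The sketch below is a heavily simplified illustration of the simulator-based approach: a one-zone thermal simulator paired with tabular Q-learning over discretized hall temperatures and fan setpoints. The thermal response, reward weights, action set, and discretization are assumptions for illustration, not Meta's production simulator or agent.

```python
# Sketch: tabular Q-learning against a toy data-hall thermal simulator.
import random

ACTIONS = [0.6, 0.8, 1.0]        # normalized supply-fan airflow setpoints (assumed)
TEMP_BINS = list(range(18, 31))  # discretized hall temperature, °C (the 18-30°C envelope)

def simulate_step(temp_c, airflow, outdoor_c):
    """Crude one-step thermal model: more airflow pulls the hall toward outdoor conditions."""
    it_heat = 1.5                                   # °C added per step by IT load (assumed)
    cooling = airflow * max(temp_c - outdoor_c, 0.0) * 0.4
    fan_energy = airflow ** 3                       # fan power rises roughly with the cube of flow
    return temp_c + it_heat - cooling, fan_energy

def reward(temp_c, fan_energy):
    """Penalize leaving the temperature envelope and spending fan energy."""
    envelope_penalty = 0.0 if 18.0 <= temp_c <= 30.0 else -10.0
    return envelope_penalty - fan_energy

q = {}                                              # (temperature bin, action index) -> value
alpha, gamma, eps = 0.1, 0.95, 0.1

for episode in range(2000):
    temp = 24.0
    outdoor = random.uniform(10.0, 28.0)            # one sampled weather condition per episode
    for _ in range(48):                             # 48 control steps per episode
        s = min(TEMP_BINS, key=lambda b: abs(b - temp))
        if random.random() < eps:
            a = random.randrange(len(ACTIONS))
        else:
            a = max(range(len(ACTIONS)), key=lambda i: q.get((s, i), 0.0))
        temp, energy = simulate_step(temp, ACTIONS[a], outdoor)
        s2 = min(TEMP_BINS, key=lambda b: abs(b - temp))
        best_next = max(q.get((s2, i), 0.0) for i in range(len(ACTIONS)))
        old = q.get((s, a), 0.0)
        q[(s, a)] = old + alpha * (reward(temp, energy) + gamma * best_next - old)

# After training, the greedy policy maps the current hall temperature to an airflow setpoint.
```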
🖨️ How Will The Apple Reality Pro Headset Boost 3D Printing?
While most AR/VR companies certainly rely on 3D printing to some extent, at least at the level of product design, Apple's latest product in particular may kickstart a niche segment of the industry known as "additively manufactured electronics (AMEs)." To those who have been following the 3D printing industry, AMEs are the most obvious method for squeezing electronics into small spaces. With 3D printing, it's possible to spray conductive traces onto curved surfaces using Optomec's Aerosol Jet technology, which allows electronic features to be incorporated into the structure of a product rather than forcing entirely separate components into already tight spaces.
The Sandia National Labs spinout has sold Aerosol Jet printers to Google, Meta, and Samsung, and has all but confirmed that Apple is using the process as well. By 2016, Taiwanese manufacturer Lite-On Mobile was using these systems to spray antennas onto millions of mobile phones before its then-senior manager of Technology Development for Antennas, Henrik Johansson, left to work for Apple.
However, it isn’t Aerosol Jet alone that may be used by these companies to shrink devices. In December 2022, Meta acquired optics firm Luxexcel with a goal of using its lens printing process to create AR glasses. Luxexcel’s method produces optically clear polymers with the ability to integrate waveguides, necessary for transparent displays, into its lenses. It’s no coincidence then that the social media-turned-metaverse giant will be releasing the newest version of its Quest Pro headset late this year, a device said to rival Apple’s Reality Pro.
⭐ Hunting For Hardware-Related Errors In Data Centers
The data center computational errors that Google and Meta engineers reported in 2021 have raised concerns about an unexpected cause: manufacturing defect levels on the order of 1,000 DPPM (defective parts per million). Because these hardware defects are specific to a single core in a multi-core SoC, they are difficult to isolate during data center operations and manufacturing test processes. In fact, silent data errors (SDEs) can go undetected for months because the precise inputs and local environmental conditions (temperature, noise, voltage, clock frequency) have not yet been applied.
For instance, Google engineers noted that 'an innocuous change to a low-level library' started to give wrong answers for a massive-scale data analysis pipeline. They went on to write, "Deeper investigation revealed that these instructions malfunctioned due to manufacturing defects, in a way that could only be detected by checking the results of these instructions against the expected results; these are 'silent' corrupt execution errors, or CEEs."
Engineers at Google further confirmed their need for internal data, “Our understanding of CEE impacts is primarily empirical. We have observations of the form, ‘This code has miscomputed (or crashed) on that core.’ We can control what code runs on what cores, and we partially control operating conditions (frequency, voltage, temperature). From this, we can identify some mercurial cores. But because we have limited knowledge of the detailed underlying hardware, and no access to the hardware-supported test structures available to chip makers, we cannot infer much about root causes.”
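A minimal sketch of the detection idea described above, under assumptions not taken from the article: pin a deterministic compute kernel to one core at a time and compare its output against a precomputed reference, flagging any core whose result silently diverges. Real fleet screeners sweep many instruction mixes, data patterns, and operating points; this toy uses a single hashing workload and Linux's sched_setaffinity.

```python
# Sketch: per-core silent-error probe via a deterministic workload and a golden result.
import hashlib
import os

def workload() -> str:
    """Deterministic compute kernel: repeated hashing over a fixed input."""
    data = b"silent-data-error-probe"
    for _ in range(200_000):
        data = hashlib.sha256(data).digest()
    return data.hex()

# Reference answer, ideally computed once on known-good hardware.
GOLDEN = workload()

def screen_core(core_id: int) -> bool:
    """Pin this process to core_id, rerun the kernel, and report whether it matches."""
    os.sched_setaffinity(0, {core_id})              # Linux-only
    return workload() == GOLDEN

if __name__ == "__main__":
    suspects = [c for c in sorted(os.sched_getaffinity(0)) if not screen_core(c)]
    print("cores with mismatching results:", suspects or "none")
```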