Dystopic - The Machine Age Part 2 – What Makes The New Machine?
December 12, 2025
Dystopic Newsletter
The Machine Age Part 2 – What Makes The New Machine?
A look at the components and costs of Humanoid Robots
Humanoid Robot (Bonsystems)
What Makes the New Machine?
A new era of humanoid robotics, enabled by Physical AI, is dawning.
Physical AI is the driving force behind solutions spanning self-driving taxis, military drone swarms, missile defense, augmented reality, and industrial and humanoid robotics. If you think Large Language Model (LLM) AI is going to have an impact on your life and work, you're probably right: it is. Companies like OpenAI, Google DeepMind, and Anthropic, to name a few, are working tirelessly to achieve machine understanding of language, speech, and symbolic information: General Intelligence.
Physical AI is following directly behind LLM AI, ushering in the Machine Age. It's not science fiction; it is a scientific fact, and it will fundamentally change our world.
This is the second in a three-part series of Dystopic newsletters about “The Machine Age”, exploring the technology and advancements behind a revolution in humanoid robotics. The first Dystopic in the series, Machine Age Part 1 – Introduction to Physical AI, explored the motivation to build humanoid robots and the advances in AI that are making what was once science fiction a reality. To summarize briefly:
Why Humanoid Robots? The world’s infrastructure has been designed for humans who walk on two legs, have two arms, and have two 10-digit, highly dexterous hands. The focus on general-purpose humanoid robots enables a single mass-produced product to perform any work a human can. In short, economies of scale: one product with a world’s worth of applications.
Why Now? We have reached a technological and economic inflection point at which both the computing and mechanical components of a robot become economically viable. AI computing and memory follow “Moore’s Law,” which states that computing capabilities double and costs halve every 2 years. Nvidia (GR00T) and Tesla (A-series) have just released chipsets and computer hardware capable of driving humanoid robots today. Tesla predicts its $130K Optimus robot will fall to $20K a unit in the early 2030s – robots for everyone!
Physical AI Evolution. Unlike Large Language Model AI, which acts on written/spoken language and symbolic knowledge, Physical AI must create a world model of its surroundings, understand physical limits (weight, pressure, location, motion, etc.), operate in real time, and be capable of reasoning and planning. Physical AI uses a VLA (Vision-Language-Action) model to enable robots to navigate and operate in the world.
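The cost-halving arithmetic behind that “Moore’s Law” framing is easy to sketch. The projection below is purely illustrative, using the $130K starting price and two-year halving period quoted above; it is a sketch, not an actual Tesla roadmap:

```python
# Illustrative "Moore's Law"-style cost projection: unit cost halves every
# `halving_period` years. Starting price and period are the figures quoted
# above; this is a sketch, not a forecast model.
def projected_cost(start_cost, years, halving_period=2.0):
    """Unit cost after `years`, assuming cost halves every `halving_period` years."""
    return start_cost * 0.5 ** (years / halving_period)

for year in (2025, 2027, 2029, 2031):
    print(f"{year}: ${projected_cost(130_000, year - 2025):,.0f}")
# 2025: $130,000 ... 2031: $16,250
```

On those assumptions, the unit price drops below $20K around 2030–2031, consistent with Tesla’s stated target.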
The hype about “humanoid” robots is all well and good, but how are we actually going to build these “humanoid” robots? How can they possibly be cost-effective? What is the actual state of human robotics technology? That is the subject of part 2 of this series:
“What Makes the New Machine” – A look at the components and costs of Humanoid Robots
It all starts with a BoM, the bill of materials: the subsystems and components that make up our humanoid robot. So let’s start there.
A Robot is the Sum of Its Parts
This may surprise you: a humanoid robot has far fewer components than a self-driving electric vehicle (EV) like a Waymo or Tesla taxi. An EV is a full-sized vehicle designed for transportation and safety, comprising thousands of parts that form the complete physical structure, interior, and safety systems for carrying passengers. A humanoid robot, in its current state, is much smaller and simpler in overall structure, though highly complex in its joint mechanics, especially its dexterous hands.
The Humanoid Robot vs self-driving Electric Vehicle Analogy (P Struhsaker, Tesla, Waymo)
Despite their differences, EVs and humanoid robots share 70% of their technology, according to Xpeng, a Chinese EV/robotics manufacturer. The same technology-sharing applies to Tesla and Optimus. Since we are all familiar with EVs, let's begin our discussion with the EV-humanoid robot comparison, as outlined in the following table:
Feature Comparison of Humanoid Robot and Self-driving EV
Self-driving EVs are designed to carry human passengers safely for hundreds of miles, so they need powerful batteries and electric motor drivetrains. Humanoid robots use jointed limbs powered by actuators for movement, and dexterous hands focused on fine control and positioning. Unlike robots, cars have thousands of parts designed to carry and protect human passengers and cargo weighing 1,000 lbs or more. Robots can carry themselves and roughly 100 to 200 pounds of payload (lift capacity) over short distances. That's where the differences end.
Both solutions share sensors and use Physical AI to understand and operate in real-world environments, handling external events and motion in real time. The sensing and processing capabilities of these two platforms are very similar. It is no wonder the EV self-driving car companies like Xpeng in China and Tesla in the United States have a leg up on robotics companies with no background or investment in self-driving cars.
With this analogy in mind, let's take a deeper look at the exact components that make up a humanoid robot.
Component Subsystems of a Humanoid Robot (P Struhsaker, BoA, Citrini Research)
All robots, including humanoid robots, are composed of a set of subsystems that include:
Actuation: The robotic equivalent of muscles, physically moving the joints and the digits of the hands. Humanoid robots include both rotary and linear actuators that move the joints and, in the case of the robotic dexterous hand, linear screws that flex wire tendons. According to ResearchGate, Tesla’s Optimus Gen 3 robot uses 28 structural actuators for the torso, shoulders, hips, wrists, elbows, ankles, and knees. An additional 12 actuators, 6 per hand, control Optimus Gen 3’s dexterous hands. We'll dive deeper into actuators and dexterous hands later in this newsletter.
Perception: The robot's senses for sight, balance, and positioning of its body within the world. Cameras and LiDAR (Light Detection and Ranging) give the robot depth and position information about its surroundings. IMUs, or inertial measurement units, give the robot's body a sense of balance. Force sensors at the hands and feet give the robot a sense of the pressure it exerts when its limbs contact objects in the physical world.
Mechanical: The bone structure and covering (skin) of our robot: the chassis, on which all the other components are mounted, and its protective cover.
Power: The robot's energy storage, in the form of batteries, and the systems that distribute power to the robot’s components. Lithium-ion battery packs and battery chargers form the heart of the power system. A power distribution board, which includes voltage regulators and converters, powers the computer, actuators, and all other electronics within the robot, along with fuses and circuit breakers to protect the electronics from overload.
Compute and Control: The brains and nervous system of our robot, along with the intelligence (software) to direct the robot’s conscious (high-level tasks) and unconscious (low-level autonomous) movements. The main computer board serves as the robot's “brain,” directing its tasks, while real-time controllers (embedded microcomputers) operate its limbs and digits. The IMU (inertial measurement unit) and the visual and tactile sensors all feed into the main computer. The Robot Operating System (ROS) software processes this data and converts it into the actions the robot takes, moment by moment, to achieve its tasks.
Communications and I/O: The ears and mouth of our robot, enabling communication with it, as well as the electronic interfaces for device updates. Speakers and microphones handle audio communications and background-noise cancellation. Wi-Fi and cellular modems connect to the cloud for extended data queries, instructions, software downloads, and security patches. Electronic buses (CAN or Ethernet) serve as the nervous system, connecting all hardware across the robot. Finally, a wired interface (USB-C, Ethernet, etc.) hardwires our robot to a PC, tablet, smartphone, or a hard-line internet connection.
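The subsystems above cooperate in a continuous sense-plan-act loop: perception feeds compute and control, which commands actuation. Here is a minimal, hypothetical sketch of that loop in Python; the function names and toy planner are illustrative, not the real ROS API:

```python
# Hypothetical sense-plan-act loop: Perception -> Compute & Control -> Actuation.
# All names here (read functions, planner, actuators) are illustrative.
def control_step(sensors, planner, actuators):
    state = {name: read() for name, read in sensors.items()}  # Perception
    commands = planner(state)                                  # Compute & Control
    for joint, target in commands.items():
        actuators[joint](target)                               # Actuation
    return state, commands

# Toy example: steer one joint toward a 0.5 rad setpoint.
log = []
sensors = {"joint_angle": lambda: 0.25}                 # pretend encoder reading
planner = lambda s: {"joint": 0.5 - s["joint_angle"]}   # command the remaining error
actuators = {"joint": lambda cmd: log.append(cmd)}      # pretend motor driver

state, commands = control_step(sensors, planner, actuators)
print(commands)  # {'joint': 0.25}
```

A real robot runs this loop hundreds of times per second, with the low-level joint control delegated to the embedded real-time controllers described above.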
That was a fairly technical answer. Sometimes a picture is worth a thousand words. Here is a graphic analogy between the subsystems of a humanoid robot body and a human body, which provides an alternate perspective:
Human Body vs Humanoid Robot Body – An Analogy (Morgan Stanley)
Now that we have a basic understanding of the systems and components that make up a humanoid robot, it's time to look at its cost and cost trends.
Why? Unless you want to consider humanoid robots vanity projects for nation-states (China, US, UK, etc.) or wealthy industrialists (Elon Musk, etc.), humanoid robots must reach price points and productive capabilities that let them operate profitably; otherwise, why produce them?
The Cost of the New Machine – Today and Tomorrow
Jim Rauf, a professor at the University of Cincinnati, conducted a cost survey of humanoid robots in the spring of 2025. His focus was on full-featured, fully human-capable robot designs and their costs (mid-2024 to 2025). Here is a summary:
Tesla: Optimus $30,000 - $100,000, with Tesla's stated goal of $20,000 in the 2030 time frame. Note: Morgan Stanley has a detailed cost breakdown of Optimus Gen 2 at $50k to $60k, which we will discuss later.
Figure: Figure 01 $100,000
Agility Robotics: Agility Digit $250,000
Unitree Robotics: Unitree H1 $90,000
Xpeng: G1 EDU (Engineering Development Unit) Ultimate D $73,900 (Note: you can find the price of the G1 Ultimate D HERE)
At a more detailed level, Morgan Stanley has broken down the costs of Tesla Optimus Gen 2 as follows:
Head, $2.1k (vision and compute/AI processor)
Shoulders, $7.8k (6 rotary actuators)
Upper arm, $1.1k (2 linear actuators)
Elbow, $7.8k (2 rotary actuators)
Forearm, $2.2k (4 linear actuators)
Waist & pelvis, $7.8k (6 rotary actuators)
Thigh, $7.3k (4 linear actuators)
Hands, $9.5k (12 actuators and 2 x 6-axis force sensors)
Calf, $7.6k (6 linear actuators)
Skeleton and miscellaneous, $0.5k
Feet, $6.7k (2 x 6-axis force sensors)
This totals just over $60k. Fun fact: it’s not computation (AI), vision, or other sensors that drive the cost of our humanoid robot; it’s the actuators. Smart actuators, including motors, reduction gears, screws, bearings, encoders, and force sensors, account for roughly 90% of the cost of our humanoid robot. Here is our graphical breakdown of costs by part type and by function:
BoM Breakdown of a Humanoid Robot by Part Type (Morgan Stanley)
BoM Breakdown of a Humanoid Robot by Function (Morgan Stanley)
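Re-tallying Morgan Stanley’s estimates is a quick sanity check on the total. A short sketch, with values in thousands of dollars copied from the list above:

```python
# Morgan Stanley's Optimus Gen 2 cost estimates, in $k (from the list above).
bom = {
    "Head (vision/compute)": 2.1, "Shoulders": 7.8, "Upper arm": 1.1,
    "Elbow": 7.8, "Forearm": 2.2, "Waist & pelvis": 7.8, "Thigh": 7.3,
    "Hands": 9.5, "Calf": 7.6, "Skeleton & misc.": 0.5, "Feet": 6.7,
}
total = sum(bom.values())
largest = max(bom, key=bom.get)
print(f"Total: ${total:.1f}k")                            # Total: $60.4k
print(f"Largest line item: {largest}, ${bom[largest]}k")  # Hands, $9.5k
```

The actuator-heavy limb assemblies dominate; the hands alone, with their 12 actuators and force sensors, are the single most expensive line item.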
Given that actuator costs dominate the BoM (Bill of Materials), we will examine actuators in greater detail. However, before we do, some final comments on cost and cost reductions.
Joseph Stalin once observed that “Quantity has a Quality all its own.”
We can extend his observation to the evolution of humanoid robotics.
Humanoid robots in 2025 are in the R&D (research and development) / EDU (engineering development unit) stage. Costing more than most consumers' cars, and still under development, current humanoid robots are not economically viable. However, financial analysts project that mass-production costs will decline rapidly with volume. Some analysts, Morgan Stanley included, believe there will be a distinct supply-chain cost difference, nearly two-to-one, between the US/“advanced economy” producers and China.
Financial analysts take the current EV price differential between the Chinese and Western automobile markets and project it onto the robotics supply chain. However, this may be a false assumption: the current Chinese EV market is experiencing price dumping due to oversupply. Too many carmakers are producing too many cars, so there is a “race to the bottom” in pricing to move inventory. Second, the Chinese EV market has yet to close the gap in quality and longevity of its products.
Certainly, Elon Musk, who is centering his entire supply chain in the US, disagrees with this analysis. Musk has stated on numerous occasions that Optimus would reach a selling price of $20K to $25K (i.e., cost to build plus profit) by 2040. This is roughly half the cost of an Electric Vehicle and represents a price point Tesla believes would be attractive to consumers. To hit that price point, Optimus was designed for mass production as a starting point.
The following diagram provides the projected average selling price (ASP) of a humanoid robot over time. The projections reflect Morgan Stanley’s two-tier cost model, Tesla’s ASP projections, and the “magic” $25K price-point estimates.
Humanoid Robot Average Sales Price Projection (Tesla, Morgan Stanley)
Regardless of which projected cost model you choose, by 2040 humanoid robots will reach a cost point at which most forms of human labor can be profitably replaced. What does that mean for us humans? That will be the subject of Part 3 of this series: the human impact of this New Machine Age.
Now let’s look at those actuators …
Actuators: the Strength Behind the New Machine
The invention of the wheel (~4000 BCE) is widely regarded as one of the most significant technological inventions in history. Horse-drawn carriages, roads (think Rome), railroads, and automobiles followed from this humble creation.
Actuators evolved at the dawn of the Industrial Revolution (late 1700s): steam drove a piston, gearing converted that motion into rotational energy, and pneumatic pressure stamped steel or powered the brakes of steam railroad engines.
The electric linear actuator as we know it today was invented by Bent Jensen in 1979. Jensen was a Danish businessman who sought to develop an idea to save his failing agricultural business. One day, whilst talking to an old friend who had a disability, Bent conceived the idea of developing an electric actuator to adjust the wheelchair.
Little did Jensen realize that his invention would kick off a robotics revolution less than 20 years later.
Actuators come in two types:
Linear Actuator: drive gearing that can extend or retract a central screw (IN and OUT)
Rotational Actuator: drive gearing that rotates around a central point (AROUND)
To illustrate the point, click HERE to watch a brief video of a rotational and a linear actuator in action.
Short Demo of both a Rotational and Linear Actuator (Oriental Motor)
It is a bit more complex than that, of course. A modern electric actuator system contains the following parts:
Screw: a mechanical component that converts the motor’s rotary motion into linear motion. Humanoid robots use planetary roller screws because they offer higher load capacity, superior rigidity, and longer lifespan.
Reducer: a form of gearing that reduces motor speed and improves the torque output and motion accuracy of a humanoid's joints. Humanoid robots mainly use harmonic and planetary reducers.
Motor: an electric motor generates driving torque and is installed at the joint of the humanoid to control motion.
Note: an electric motor consists of a stator, the stationary part, and a rotor. Combinations of magnets and windings on the stator and rotor create circular motion when electrical current is applied.
Sensor: actuators use force sensors that convert force magnitude into an electrical signal used by the actuator's controller. Other sensors convert linear and rotational position into electrical signals for the encoder.
Encoder: encoders (control computers) are connected to the motor to monitor its status and send feedback to the actuator, which aggregates, analyzes, and corrects the signal to precisely control output variables such as actuator position, speed, and torque.
Bearing: the outer support part of our actuator. It ensures rotary precision by supporting the rotating components and reducing friction to maintain accuracy.
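The encoder-to-motor feedback described above can be illustrated with a toy proportional control loop. The gain and single-axis dynamics below are invented for illustration; real actuator controllers run thousands of cycles per second and typically add integral and derivative (PID) terms:

```python
# Toy proportional position control: each tick, the encoder reports the
# current position, the controller computes the error against the target,
# and the motor is commanded to close a fraction (kp) of that error.
def step(position, target, kp=0.5):
    error = target - position      # encoder feedback vs. commanded position
    return position + kp * error   # motor closes part of the gap

position, target = 0.0, 1.0        # e.g., joint angle in radians
for tick in range(10):
    position = step(position, target)
print(round(position, 4))          # 0.999 -- converging on the target
```

This is the essence of “aggregate, analyze, and correct”: the loop never commands a blind move; it continuously corrects based on measured position.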
The following diagrams illustrate the components of the linear and rotational actuators used in a Tesla Optimus humanoid robot:
The Anatomy of Linear and Rotary Actuators used by Tesla Optimus (Tesla)
"Traditionally, an actuator is composed of a motor and a reducer. However, in recent years, “smart actuators” have emerged that integrate not only motors and reducers but also drivers and sensors such as torque sensors and encoders. This advancement has made actuators a central trend in AI robotics, transforming them from simple motion-generating parts into comprehensive motion solutions that include sensing and control capabilities.”
There are some practical differences between linear and rotary actuators, outlined in the following table. Linear actuators are more efficient, self-locking (requiring no brake pad or braking energy to hold their position), and more durable. Unfortunately, linear actuators are also more costly (when something is good, there is always a catch!).
Comparison of Rotational and Linear Actuators
Regardless of the actuator type, further improvements are needed in:
Life cycle: the total operating hours before the actuator wears out
Continuous use: the hours actuators can run before they overheat or exhaust their energy
Runtime per charge: the hours of operation on a single charge of the onboard battery/power system. For a factory robot, a full 8 hours (i.e., a shift) would be ideal.
Our humanoid robot’s components and subsystems are improving and becoming more cost-effective. However, it will take several years, possibly until 2030 or longer, to deliver a viable commercial robot with the life cycle, hours of continuous operation, and battery capacity the market expects.
Now that you have a grasp of the “nuts and bolts” of an actuator, let's take a look at one of the most impressive displays of actuators … the robotic hand.
It’s All In the Hands and Feet
The benchmark for a robotic hand is the human hand, which has roughly 27 distinct axes of movement, called degrees of freedom (DoF). The human hand uses a combination of tendons in the forearm (which route through the carpal tunnel) and 20 or so muscles in the hand for position sense and fine motor control.
The major challenge in humanoid robotics is developing a Minimum Viable Solution (MVS) for a robotic hand. What is good enough? Enter the Tesla Optimus Gen 3 hand...
Tesla Optimus Gen 3 Hand with 22 Degrees of Freedom (Tesla)
The Optimus Gen 3 hand design is inspired by the human hand and employs a mechanical tendon system. To get an idea of the Gen 3’s complexity, check out this 60-second video HERE. It is impressive.
The Gen 3 provides 22 degrees of freedom (DoF) of hand actuation:
Fingers: 4 fingers with 4 DoF each (total of 16 DoF)
Thumb: 1 thumb with 5 DoF
Wrist: 1 wrist with 1 DoF
Total: 22 DoF, compared to the human hand's 27 DoF
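Tallying the list above against the human benchmark from earlier in this section:

```python
# Optimus Gen 3 hand degrees of freedom, per the breakdown above.
dof = {"fingers": 4 * 4, "thumb": 5, "wrist": 1}
total_dof = sum(dof.values())
print(total_dof)                                                   # 22
print(f"{total_dof}/27 ≈ {total_dof / 27:.0%} of human-hand DoF")  # ≈ 81%
```

So the Gen 3 hand recovers roughly four-fifths of the human hand's articulation, which is the “good enough” bet behind the MVS approach.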
The joint angles of the hands closely match human ranges:
Finger flexion: 0-90 degrees at each joint
Thumb opposition: 180-degree arc across palm
Wrist rotation: ±90 degrees pronation/supination
Wrist flexion: ±70 degrees up/down movement
Each fingertip contains multiple sensors:
Normal force sensors: Detect pressing/squeezing forces
Shear force sensors: Detect slipping or lateral forces
Texture sensors: Identify surface roughness and materials
Temperature sensors: Prevent handling hot/cold objects unsafely
A highly detailed description of the Optimus Gen 3 hand and finger design is available in Tesla’s hand-and-finger-mechanism patent, WO2024073138A1; a preliminary patent, US20100259057A1, is also available. It’s rather dry reading but very interesting if you are a wonky engineering type like me, or if you're just curious.
Lex Fridman, a research scientist at the Massachusetts Institute of Technology whose work focuses on human-robot interaction, autonomous vehicles, and machine learning, interviewed Elon Musk on the “Lex Fridman Podcast” for a detailed discussion of the design of the new Gen 3 Optimus hand. It is very informative; view the interview HERE.
Lex Fridman interviews Elon Musk on Optimus Gen 3 Hand Design (Lex Fridman)
Tesla engineers didn’t stop at the hands; they wanted Optimus to walk in the real world. You can watch a video of Optimus walking uphill and hiking through rough natural terrain HERE.
Tesla engineers achieved improved balance and full-body control through foot force/torque sensing and human-like foot geometry, incorporating articulated toe sections. Based on Tesla’s foot patent WO2017068037A1, Optimus has at least 3 pressure sensors to measure foot placement on a surface: two on the left and right sides of the articulated toe section, and one in the heel.
If you think about it, we judge how steady and sure-footed we are when we walk by the pressure we feel across our toes; it should be even. If there is a significant pressure imbalance, we are likely slipping or on unstable footing, and our bodies adjust our center of gravity to prevent falling. Optimus is capable of this exact behavior.
Tesla Optimus 2-degree-of-freedom ankle and forefoot (Tesla)
Clearly, evolution created structures, especially in our hands and feet, that are ideal templates for robots in the new Machine Age.
The Android – A Human Robot
Loneliness is a growing problem for the aging and disabled. Research from the US National Institutes of Health links loneliness to a significantly raised risk of premature death, with some studies showing increases of 26-45%.
Humanoid care robots are viewed as a potential solution. Armed with LLM AI for conversation and Physical AI to handle chores, prepare meals, and assist in any way needed, household humanoid robots are seen as an obvious solution for a large and growing market.
The elderly need a friendly, helpful, approachable, human-like robotic caregiver and companion.
Enter the UK robotics company Engineered Arts and its robotic prototype of an android, a human-like head and face, called Ameca. Engineered Arts is the premier robotics company focusing on human-like facial expression; the company advertises itself as “The Global Leader in Social Humanoid Robotics.”
Engineered Arts Ameca - Social Humanoid Robot (Engineered Arts)
AMECA:
32 DoF in the Head & Neck
27 degrees of freedom animating the eyes, lips, and other facial features, plus 5 degrees of freedom in the neck, allow Ameca to perform a wide range of facial expressions and gestures.
Ameca's head and neck are designed to move fluidly and naturally, allowing for a wide range of gestures and interactions. Ameca can smile, frown, raise its eyebrows, and perform other subtle facial movements that mimic human expressions and gestures. Each degree of freedom is powered by advanced motors and servos that enable precise control over movements and can be customized using Engineered Arts' web-based software platform, Tritium.
[The details ... ]
2 DoF – eyebrow (x2)
4 DoF – eye (x2)
1 DoF – nose
12 DoF – lips
2 DoF – jaw
5 DoF – neck
Much like the robotic hand mirrors the mechanics of the human hand, Ameca provides a human social interface based on the mechanics of the human face.
Ameca is something you have to see to believe. For example, take a look at this Bloomberg “interview” video of Ameca HERE. In some ways, an android social interface to our humanoid robots, while more approachable than faceless high-end humanoid robots like Figure 01 or Optimus, is “a bit creepy.” What do you think?
Closing Thoughts
A technological and commercial race is intensifying worldwide to deliver functional, cost-effective, and reliable humanoid robots to the market.
Central to all humanoid robotics solutions are electromechanical actuators that drive movement and locomotion. Like the AI and computer silicon chips driving LLMs and Physical AI, actuators are undergoing their own cost-reduction and quality revolution.
By all measures, we are heading for a reckoning in the 2040 time frame: the point at which humanoid robots can economically replace humans for the vast majority of tasks.
What happens to us, the human race, when humanoid robots arrive? That is the topic of the third and final newsletter on this subject: The Machine Age Part 3 – The Human Factor.
The "Fortress Belt" is a heavily fortified set of cities in Donest Oblast, Ukraine, which includes cities of Slovyansk, Kramatorsk, Kostyantnivka, and Pokrova.
Ukraine's Fortress Belt – the sticking point to any negotiated peace (ISW – Institute for the Study of War)
The Noose Tightens on Venezuela
U.S. forces escalated tensions with Venezuela. As a Reuters news report notes, “The U.S. has seized a sanctioned oil tanker off the coast of Venezuela, President Donald Trump said on Wednesday, a move that sent oil prices higher and sharply escalated tensions between Washington and Caracas.”
Will it take US forces on the ground to oust Venezuelan strongman Maduro? Stay tuned …
National Security Strategy of the United States of America
The U.S. Department of War has released its 2025 National Security Strategy. The document is released every 4 years, regardless of who is President, as a guide to the priorities the current President will focus on during the remainder of their 4-year term.
Just Released: the National Security Strategy of the United States of America (DoW)
The document defines what is meant by “Strategy” and proceeds to ask two questions:
What Should the United States Want?
What Do We Want In and From the World?
The document then presents the complete strategy: Principles, priorities, and impact on world regions.
What I found most interesting for Dystopic readers, and have paraphrased below, is the second question …
What Do We Want In and From the World?
Western Hemisphere: We want to ensure that the Western Hemisphere remains reasonably stable and well-governed enough to prevent and discourage mass migration to the United States. We want a hemisphere that remains free of hostile foreign incursion. In other words, we will assert and enforce a “Trump Corollary” to the Monroe Doctrine.
Indo-Pacific: We want the region to be free and open, preserving freedom of navigation in all crucial sea lanes, and maintaining secure and reliable supply chains and access to critical materials
Europe: We want to support our Western Allies while restoring Europe’s civilizational self-confidence and Western identity (read that as self-reliant)
Middle East: We want the region to be free of outside adversarial powers (read that as Russia, China, and Iran) and ensure free flow of energy and goods while maintaining peace without further wars.
Africa: Interestingly, nothing is said about Africa – outside of the Middle East – it appears Africa does not factor into US plans except for economic development.
If you are European, you may find parts of this document somewhat insulting, as it continues President Trump's rhetoric concerning the “decay of Europe.”
Putting that aside, I strongly recommend reading the 33-page document to understand the significant changes outlined in the US defense posture. You can read the complete document HERE.
That’s a wrap for this week …
Dystopic- The Technology Behind Today's News
Thank you for your readership and support. Please recommend Dystopic to friends and family who are interested, or just share this email.
Not on the Dystopic mailing list? New readers can sign up for Dystopic HERE
If you have missed a Dystopic newsletter, you can find select back copies HERE: https://paul-struhsaker.kit.com/profile
Finally, pick up a copy of my book or listen to the new audiobook:
How The Hell Did We Get Here? A Citizen's Guide to The New Cold War and Rebuilding of Deterrence
Available on Amazon USA HERE, Amazon Internationally (on your local Amazon page), or through Barnes & Noble and other major retailers online