Actroid is a type of android (humanoid robot) with strong visual human-likeness

Actroid is a type of android (humanoid robot) with strong visual human-likeness developed by Osaka University and manufactured by Kokoro Company Ltd. (the animatronics division of Sanrio). It was first unveiled at the 2003 International Robot Exhibition in Tokyo, Japan. Several different versions of the product have been produced since then. In most cases, the robot’s appearance has been modeled after an average young woman of Japanese descent.

The Actroid is a pioneering real-world example of the kind of machine that science fiction calls an android or gynoid, terms previously reserved for fictional robots. It can mimic such lifelike functions as blinking, speaking, and breathing. The “Repliee” models are interactive robots with the ability to recognize and process speech and respond in kind.

The original Repliee Q1 had a “sister” model, Repliee R1, which is modeled after a 5-year-old Japanese girl.

More advanced models were present at Expo 2005 in Aichi to help direct people to specific locations and events. Four unique faces were given to these robots. The ReplieeQ1-expo was modeled after a presenter for NHK news. To make the face of the Repliee Q2 model, the faces of several young Japanese women were scanned and the images combined into an average composite face.

The newer model Actroid-DER2 made a recent tour of U.S. cities. At NextFest 2006, the robot spoke English and was displayed in a standing position and dressed in a black vinyl bodysuit. A different Actroid-DER2 was also shown in Japan around the same time. This new robot has more realistic features and movements than its predecessor.

In July 2006, another appearance was given to the robot. This model was built to look like its male co-creator, roboticist Hiroshi Ishiguro, and named Geminoid HI-1. Controlled by a motion-capture interface, Geminoid HI-1 can imitate Ishiguro’s body and facial movements, and it can reproduce his voice in sync with his motion and posture. Ishiguro hopes to develop the robot’s human-like presence to such a degree that he could use it to teach classes remotely, lecturing from home while the Geminoid interacts with his classes at Osaka University.
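The teleoperation loop described here, streaming the operator's captured pose and voice to the robot frame by frame, can be sketched as follows. All class and method names are hypothetical stand-ins, not Ishiguro's actual interface.

```python
class Frame:
    """One motion-capture sample: the operator's pose plus a slice of audio."""
    def __init__(self, joint_angles, audio_chunk):
        self.joint_angles = joint_angles
        self.audio_chunk = audio_chunk


class Robot:
    """Stand-in robot that simply records the commands it receives."""
    def __init__(self):
        self.commands = []

    def set_joint_targets(self, angles):
        self.commands.append(("pose", angles))

    def play_audio(self, chunk):
        self.commands.append(("audio", chunk))


def teleoperate(frames, robot):
    """For each captured frame, map the operator's pose onto the robot's
    joints and play the matching audio, keeping motion and voice in sync."""
    for frame in frames:
        robot.set_joint_targets(frame.joint_angles)
        robot.play_audio(frame.audio_chunk)
```

Interleaving the pose and audio commands per frame is what keeps the voice in sync with motion and posture, as the article describes.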

In May 2011, Danish lecturer Henrik Schärfe revealed a robotic version of himself. Manufactured in Japan and called Geminoid-DK, its actions are controlled remotely by a person operating a computer, but it is programmed with Schärfe’s own characteristic body movements, such as shrugs and glances.

Qoobo is a cat-like friendly pet robot

Qoobo surprised us with how realistically its tail moves, especially the intensity of its wagging.

This is essentially a cushion with a realistic cat tail that reacts to stroking and patting, such that it’s able to comfort its “owner” like a real pet would simply through tail wagging. To make it more lifelike, Qoobo also wags its tail randomly when it is left alone for too long.
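That behaviour, reacting to strokes and pats, and wagging on its own when ignored, maps naturally onto a small state machine. The sketch below is entirely hypothetical; Yukai Engineering has not published Qoobo's firmware.

```python
import random
import time


class TailController:
    """Hypothetical sketch of Qoobo-style tail behaviour."""

    IDLE_TIMEOUT_S = 60.0  # how long "left alone" lasts before a spontaneous wag

    def __init__(self, now=time.monotonic):
        self.now = now  # injectable clock, so the logic is easy to test
        self.last_touch = self.now()

    def on_touch(self, pressure):
        """Gentle strokes get a slow wag, firmer pats a brisk one."""
        self.last_touch = self.now()
        return "slow_wag" if pressure < 0.5 else "brisk_wag"

    def idle_update(self):
        """Wag randomly when ignored for too long, as the article describes."""
        if self.now() - self.last_touch > self.IDLE_TIMEOUT_S:
            return random.choice(["slow_wag", "flick"])
        return "rest"
```

The injectable clock and the two-threshold touch response are design choices for the sketch only; the real product presumably reads accelerometer or capacitive sensors rather than a single pressure value.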

The idea of Qoobo originated from one of Yukai Engineering’s 20 employees in an internal competition about half a year ago. The designer had a cat, but since the apartment she moved into didn’t allow pets, she had to leave her cat with her parents. This inspired her to come up with a lifelike pet substitute that would make her feel better whenever she thought of her cat. And of course, this would double as a therapy robot for potentially treating depression and anxiety, as not everyone has access to a therapy cat or dog for various reasons — be it costs, allergies or the aforementioned apartment restrictions.

https://www.engadget.com/2017/10/04/qoobo-cat-tail-cushion-therapy-robot/

Tipron is a transforming robot projector that looks like a rolling eyeball

The Tipron, developed by Japanese smart device maker Cerevo, looks like a white eye on a sleek robotic stalk and wheeled base — a child’s friendly sidekick droid in some ’70s science fiction movie. It has one purpose: to move around your house and project things on walls. I assume there is a segment of the population that has always wanted a mobile robot projector; in fact, another one debuted two years ago. The rest of us can enjoy it for what it is: the prototypical weird CES gadget.

It’s hard to say how well the Tipron fulfills its intended purpose on a crowded show floor, especially given the bright lights and lack of a screen to show off the projection quality. It’s managed via a smartphone app that acts like a remote control for both the projector head and the robot itself. While moving, the Tipron folds up and slowly rolls wherever it’s directed. With a button tap, it extends into projection mode, where users can change the angle and keystone of an image.

Tipron can automatically project content you’d like to see, such as movies or pictures, as an 80-inch image, not only on a wall but also on a floor or ceiling, at any angle, from a distance of 3 meters.
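The quoted figures, an 80-inch image from 3 meters, pin down the projector's throw ratio with a little geometry. The sketch below assumes a 16:9 image, which the article does not state.

```python
import math


def image_width_m(diagonal_in, aspect=(16, 9)):
    """Width in metres of a projected image, given its diagonal in inches.
    The 16:9 aspect ratio is an assumption, not a published spec."""
    w, h = aspect
    return diagonal_in * w / math.hypot(w, h) * 0.0254


def throw_ratio(distance_m, diagonal_in):
    """Projector throw ratio: throw distance divided by image width."""
    return distance_m / image_width_m(diagonal_in)


# Tipron's quoted figures: an 80-inch image from 3 metres
ratio = throw_ratio(3.0, 80)
```

Under that assumption the image is about 1.77 m wide and the throw ratio comes out near 1.7, a fairly ordinary long-throw optic rather than anything exotic.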

Using the built-in RSS reader function, it is easy to display information such as news, weather forecasts and Twitter feeds. The news can also be set to scroll automatically.

Automatic charging
Tipron returns to its charging station automatically and starts charging itself after finishing all scheduled actions. It can also return to recharge when its battery is low (only supported when operating in scheduled mode).

NASA’s R5 aka Valkyrie was designed and built by the Johnson Space Center

NASA’s R5, aka Valkyrie, was designed and built by the Johnson Space Center (JSC) Engineering Directorate to compete in the 2013 DARPA Robotics Challenge (DRC) Trials. Valkyrie, a name taken from Norse mythology, is designed to be a robust, rugged, entirely electric humanoid robot capable of operating in degraded or damaged human-engineered environments. Building on prior experience from designing Robonaut 2, the JSC Valkyrie team designed and built this robot within a 15-month period, implementing improved electronics, actuators and sensing capability from earlier generations of JSC humanoid robots.

High-quality production photos of the Valkyrie robot, Bldg. 32B Valkyrie Lab, December 12, 2013. (Photographers: Bill Stafford, James Blair, Regan Geeseman)

Following the robot’s appearance at the 2013 DRC Trials, the Valkyrie team modified and improved the robot – making the hands more reliable and durable, redesigning the ankle to improve performance and upgrading sensors for increased perception capability. The Valkyrie team also partnered with the Florida Institute for Human and Machine Cognition (IHMC) to implement IHMC’s walking algorithms on NASA hardware in preparation for the Space Robotics Challenge, part of NASA’s Game Changing Development Program and Centennial Challenges.

Power/Battery
Valkyrie can be configured to run from a wall or from battery power. The custom dual-voltage battery is capable of running the robot for about an hour. When a battery is not in use, it can be replaced with a mass simulator and capacitor that simulates the mechanical and some of the electrical properties of the battery.

Head/Sensor Suite
Valkyrie’s head sits atop a 3 DOF neck. The main perceptual sensor is the Carnegie Robotics Multisense SL, with modifications to allow for IR structured light point cloud generation in addition to the laser and passive stereo methods already implemented. Valkyrie also features fore and aft “hazard cameras” located in the torso.

Arms
Each upper arm consists of 4 series elastic rotary actuators and when combined with the forearm has 7 joints. The arm has a quick mechanical and electrical disconnect between the first two joints that allows for easy shipping and service.

Forearms/Hands
Valkyrie features a simplified humanoid hand, with 3 fingers and a thumb. Each forearm consists of a single rotary actuator (realizing the wrist roll), a pair of linear actuators (realizing wrist pitch and yaw), and 6 finger and thumb actuators. The hands are attached to the ends of the arms with mechanical and electrical quick disconnects that allow for easy shipping and service.

 

Torso/Pelvis
The robot’s torso houses two series elastic rotary actuators (the first arm joint on either side), two series elastic linear actuators that work in concert to realize motion between the torso and pelvis, and various computer and power facilities. The pelvis houses three series elastic rotary actuators: the waist rotation joint, and the hip rotation joint of each leg. The pelvis is considered the robot’s base frame, and includes two IMUs.

Legs
Each upper leg contains five series elastic rotary actuators. The ankle is realized using two series elastic linear actuators working in concert. The leg has a quick mechanical and electrical disconnect between the first two joints that allows for easy shipping and service.

Specifications
Weight: 300 pounds
Height: 6 feet 2 inches
Battery energy: 1.8 kWh
Computers: 2 × Intel Core i7
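The battery figure above can be cross-checked against the power section, which says the 1.8 kWh pack runs the robot for about an hour; that implies an average electrical draw of roughly 1.8 kW.

```python
battery_energy_kwh = 1.8  # from the specifications
runtime_h = 1.0           # "about an hour" of battery operation, per the text
avg_power_kw = battery_energy_kwh / runtime_h  # average electrical draw
```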

KAIST Raptor, an Incredibly Fast Bipedal Robot Inspired by Dinosaurs

Raptor is a bipedal robot designed and built in 2014 by the Korea Advanced Institute of Science and Technology (KAIST). It has a top speed of 28.58 miles per hour (46 km/h), making it the second-fastest robot after Boston Dynamics’ Cheetah, and the fastest bipedal robot worldwide. Designers at KAIST took their inspiration from the Velociraptor, a bipedal dinosaur that balanced itself with its tail. The robot moves on a pair of carbon/epoxy composite blade legs.

It has two under-actuated legs and a tail inspired by velociraptors. The Raptor robot runs at a speed of 46 km/h on a treadmill with off-board power. Tail-assisted pitch control provides stability over high obstacles.
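Tail-assisted pitch control can be sketched as a simple PD law on body pitch: torquing the tail one way produces a reaction torque on the body the other way. The gains below are illustrative, not KAIST's published values.

```python
def tail_command(pitch_rad, pitch_rate, kp=60.0, kd=8.0):
    """Toy PD sketch of tail-assisted pitch control: command a tail torque
    whose reaction on the body opposes the current pitch error and rate."""
    return -(kp * pitch_rad + kd * pitch_rate)
```

A nose-up disturbance (positive pitch) produces a negative tail torque, whose reaction pushes the body back down; the derivative term damps the oscillation that a pure proportional law would leave behind.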

BionicANTs – Cooperative behaviour based on a natural model

The BionicANT is about the size of a human hand. (Image: Festo)

For the BionicANTs, Festo has taken not only the delicate anatomy of the natural ant as a model. For the first time, the cooperative behaviour of the creatures has also been transferred to the world of technology using complex control algorithms.

Highly integrated individual systems to solve a common task

Like their natural role models, the BionicANTs work together under clear rules. They communicate with each other and coordinate their actions and movements among each other. The artificial ants thus demonstrate how autonomous individual components can solve a complex task together working as an overall networked system.
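As a toy illustration of rule-based cooperation, the sketch below has each ant claim the nearest task that no other ant has claimed yet, as if broadcasting its claims to the group. Festo has not published its control algorithms at this level of detail, so this is only an analogy.

```python
def assign_tasks(ant_positions, task_positions):
    """Greedy, rule-based task allocation: each ant claims the nearest
    unclaimed task, so no two ants end up working on the same one.
    Purely illustrative; the real BionicANT algorithms are not public."""
    claims = {}
    for ant, (ax, ay) in ant_positions.items():
        free = [t for t in task_positions if t not in claims.values()]
        if free:
            # Manhattan distance keeps the rule simple and deterministic
            claims[ant] = min(
                free,
                key=lambda t: abs(task_positions[t][0] - ax)
                            + abs(task_positions[t][1] - ay),
            )
    return claims
```

Even this crude rule shows the principle the article gestures at: each individual follows a simple local rule, yet the group as a whole divides the work without a central coordinator.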

In an abstract manner, this cooperative behaviour provides interesting approaches for the factory of tomorrow. Future production systems will be founded on intelligent components, which adjust themselves flexibly to different production scenarios and thus take on tasks from a higher control level.

Latest production methods and technologies
The cooperative behaviour of the artificial ants is not the only amazing thing about them; their production method is also unique. The laser-sintered components are finished with visible conductor structures in the 3D MID process, so they take on structural and electrical functions at the same time.

For the actuator technology in the legs, Festo utilises the benefits of piezo technology. Piezo elements can be controlled very precisely and quickly; they require little energy, are virtually wear-free and take up little space. Three trimorphic piezo-ceramic bending transducers, which serve both as actuators and as design elements, are therefore fitted into each thigh. By deflecting the top bending transducer, the ant lifts its leg. With the pair underneath, each leg can be deflected precisely forwards and backwards.
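The lift-and-swing sequencing just described amounts to a four-phase step cycle per leg. The sketch below records the phases with stand-in names; the real legs are driven by analogue voltages on the three bending transducers, not boolean commands.

```python
class Leg:
    """Stand-in for one piezo-driven leg; records actuator commands."""
    def __init__(self):
        self.log = []

    def deflect(self, element, **state):
        self.log.append((element, state))


def step_cycle(leg):
    """One stride: lift with the top transducer, swing with the lower pair,
    set the foot down, then stroke backwards to propel the body."""
    leg.deflect("top", up=True)         # lift the foot off the ground
    leg.deflect("pair", forward=True)   # swing the raised leg forward
    leg.deflect("top", up=False)        # plant the foot
    leg.deflect("pair", forward=False)  # power stroke backwards
```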

MIT cheetah robot lands the running jump

Jennifer Chu | MIT News Office
May 29, 2015

In a leap for robot development, the MIT researchers who built a robotic cheetah have now trained it to see and jump over hurdles as it runs — making this the first four-legged robot to run and jump over obstacles autonomously.

To get a running jump, the robot plans out its path, much like a human runner: As it detects an approaching obstacle, it estimates that object’s height and distance. The robot gauges the best position from which to jump, and adjusts its stride to land just short of the obstacle, before exerting enough force to push up and over. Based on the obstacle’s height, the robot then applies a certain amount of force to land safely, before resuming its initial pace.

In experiments on a treadmill and an indoor track, the cheetah robot successfully cleared obstacles up to 18 inches tall — more than half of the robot’s own height — while maintaining an average running speed of 5 miles per hour.

“A running jump is a truly dynamic behavior,” says Sangbae Kim, an assistant professor of mechanical engineering at MIT. “You have to manage balance and energy, and be able to handle impact after landing. Our robot is specifically designed for those highly dynamic behaviors.”

Kim and his colleagues — including research scientist Hae won Park and postdoc Patrick Wensing — will demonstrate their cheetah’s running jump at the DARPA Robotics Challenge in June, and will present a paper detailing the autonomous system in July at the conference Robotics: Science and Systems.

Once the robot has detected an obstacle, the second component of the algorithm kicks in, allowing the robot to adjust its approach while nearing the obstacle. Based on the obstacle’s distance, the algorithm predicts the best position from which to jump in order to safely clear it, then backtracks from there to space out the robot’s remaining strides, speeding up or slowing down in order to reach the optimal jumping-off point.
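The stride-spacing idea can be made concrete with a little arithmetic: pick a whole number of strides that lands the robot at the takeoff point, then stretch or shrink each stride to fit exactly. This is an illustrative sketch, not the MIT team's actual optimizer.

```python
def plan_approach(distance_to_obstacle_m, nominal_stride_m, takeoff_offset_m):
    """Choose a whole number of strides that ends at the desired takeoff
    point, then adjust the stride length so they fit the run-up exactly."""
    run_up = distance_to_obstacle_m - takeoff_offset_m  # ground to cover
    n_strides = max(1, round(run_up / nominal_stride_m))
    return n_strides, run_up / n_strides


# e.g. 4 m to the obstacle, 0.5 m nominal strides, take off 0.3 m before it
n, stride = plan_approach(4.0, 0.5, 0.3)
```

With those numbers the planner takes seven slightly lengthened strides over the 3.7 m run-up, which matches the article's description of speeding up or slowing down to reach the optimal jumping-off point.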

Shimon, a four-armed, marimba-playing robot

Article from http://www.news.gatech.edu/2017/06/13/robot-uses-deep-learning-and-big-data-write-and-play-its-own-music.

A marimba-playing robot with four arms and eight sticks is writing and playing its own compositions in a lab at the Georgia Institute of Technology. The pieces are generated using artificial intelligence and deep learning.

Researchers fed the robot nearly 5,000 complete songs — from Beethoven to the Beatles to Lady Gaga to Miles Davis — and more than 2 million motifs, riffs and licks of music. Aside from giving the machine a seed, or the first four measures to use as a starting point, no humans are involved in either the composition or the performance of the music.

The first two compositions are roughly 30 seconds in length. The robot, named Shimon, can be seen and heard playing them in videos accompanying the original article.

Ph.D. student Mason Bretan is the man behind the machine. He’s worked with Shimon for seven years, enabling it to “listen” to music played by humans and improvise over pre-composed chord progressions. Now Shimon is a solo composer for the first time, generating the melody and harmonic structure on its own.

“Once Shimon learns the four measures we provide, it creates its own sequence of concepts and composes its own piece,” said Bretan, who will receive his doctorate in music technology this summer at Georgia Tech. “Shimon’s compositions represent how music sounds and looks when a robot uses deep neural networks to learn everything it knows about music from millions of human-made segments.”
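The seed-then-continue workflow Bretan describes can be illustrated with a toy model. The sketch below deliberately substitutes a first-order Markov chain for Shimon's deep neural networks, but the flow is the same: train on human-made material, supply a short seed, and let the model generate the rest on its own.

```python
import random
from collections import defaultdict


def train(corpus):
    """Collect next-note statistics from example phrases. A first-order
    Markov chain stands in here for Shimon's deep networks."""
    model = defaultdict(list)
    for phrase in corpus:
        for a, b in zip(phrase, phrase[1:]):
            model[a].append(b)
    return model


def compose(model, seed, length, rng=None):
    """Start from a human-provided seed and let the model continue alone,
    mirroring the seed-then-compose workflow the article describes."""
    rng = rng or random.Random(0)
    piece = list(seed)
    while len(piece) < length:
        options = model.get(piece[-1])
        if not options:
            break  # dead end: no observed continuation for this note
        piece.append(rng.choice(options))
    return piece
```

A Markov chain only ever looks at the previous note, which is exactly the "note by note" limitation Bretan says Shimon has moved beyond; the deep-network version conditions on much longer context to capture higher-level structure.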

Bretan says this is the first time a robot has used deep learning to create music. And unlike its days of improvising, when it played monophonically, Shimon is able to play harmonies and chords. It’s also thinking much more like a human musician, focusing less on the next note, as it did before, and more on the overall structure of the composition.

“When we play or listen to music, we don’t think about the next note and only that next note,” said Bretan. “An artist has a bigger idea of what he or she is trying to achieve within the next few measures or later in the piece. Shimon is now coming up with higher-level musical semantics. Rather than thinking note by note, it has a larger idea of what it wants to play as a whole.”

Shimon was created by Bretan’s advisor, Gil Weinberg, director of Georgia Tech’s Center for Music Technology.

“This is a leap in Shimon’s musical quality because it’s using deep learning to create a more structured and coherent composition,” said Weinberg, a professor in the School of Music. “We want to explore whether robots could become musically creative and generate new music that we humans could find beautiful, inspiring and strange.”

Shimon will create more pieces in the future. As long as the researchers feed it a different seed, the robot will produce something different each time – music that the researchers can’t predict. In the first piece, Bretan fed Shimon a melody composed of eighth notes. The second time, it received a sixteenth-note melody, which influenced it to generate faster note sequences.

The da Vinci Surgical System is a robotic surgical system

The da Vinci Surgical System is a robotic surgical system made by the American company Intuitive Surgical. Approved by the Food and Drug Administration (FDA) in 2000, it is designed to facilitate complex surgery using a minimally invasive approach, and is controlled by a surgeon from a console. The system is commonly used for prostatectomies, and increasingly for cardiac valve repair and gynecologic surgical procedures.

According to the manufacturer, the system is called “da Vinci” in part because Leonardo da Vinci’s “study of human anatomy eventually led to the design of the first known robot in history.”

Da Vinci Surgical Systems operate in hospitals worldwide, with an estimated 200,000 surgeries conducted in 2012, most commonly for hysterectomies and prostate removals. As of September 30, 2016, there was an installed base of 3,803 units worldwide – 2,501 in the United States, 644 in Europe, 476 in Asia, and 182 in the rest of the world. The “Si” version of the system costs on average slightly under US$2 million, in addition to several hundred thousand dollars of annual maintenance fees. The da Vinci system has been criticised for its cost and for a number of issues with its surgical performance.

Erica Robot – World’s First TV News Anchor in Japan

Her voice may also be used to talk to passengers in autonomous vehicles, reports the Wall Street Journal.

Erica was developed with money from one of the highest-funded science projects in Japan, JST Erato.

Although she is unable to move her arms, she can work out where sound is coming from and knows who is asking her a question.

Using 14 infra-red sensors and face recognition technology, she can track people in a room.

Erica’s ‘architect’, Dr Dylan Glas, says the robot has learned to tell jokes, ‘although they’re not exactly side-splitters’.

‘What we really want to do is have a robot which can think and act and do everything completely on its own’, he said.

This isn’t the first time Ishiguro has created a robot newsreader.

In 2014, Dr Ishiguro unveiled ultra-realistic robot news anchors called Kodomoroid and Otonaroid at a Tokyo museum.

Hiroshi Ishiguro, director of the Intelligent Robotics Laboratory at Osaka University and the creator of the humanoid, says he’s been trying to get her on air since 2014.

It’s not known on what network she will appear or when. She will also lend her voice to in-car interactions for self-driving vehicles.

Read more: http://www.dailymail.co.uk/sciencetech/article-5328821/Erica-robot-life-like-soul.html