Ex-NASA Engineer Made the Perfect Rock-Skipping Robot

Ex-NASA engineer and YouTube inventor Mark Rober has made a perfect rock-skipping robot. Not only can the robot perform impressively, it can also help you learn how to skip rocks better. Rober built the robot by tweaking a clay pigeon thrower, adding custom wooden throwing arms and a base for stability. Once he had a prototype, his team of assistants (nieces and nephews) gave Skippa, the rock-throwing robot, a makeover with spray paint and giant googly eyes, and then brainstormed test variables for a perfect skip.

How do you achieve the perfect rock skip? The team narrowed it down to four variables: the robot's wrist angle (the angle of the rock relative to the water), its arm angle (which changes the path of the rock), and the diameter and thickness of the rocks themselves. To create uniform controls for the robot tests, Rober and his team made their own rocks out of unfired clay (the clay discs dried easily in the sun and dissolved in water in under 30 minutes). After some unsuccessful early throws, the robot began sending rocks tumbling across the water at more than 60 skips per throw.

Here's the recipe Rober finally found for the perfect rock skip: the rock should hit the water at a 20-degree angle, on a 20-degree flight path, thrown from higher up for more energy. Flicking the wrist as much as possible helps the rock spin, which keeps it stable. And finally, the most important factors in rock selection are a flat bottom and a rock that's heavy but not too big to handle. When Rober's amateur engineering team tested the principles they learned from the robot, they quickly improved their skips from an average of three to 16.
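For readers who want to play with the numbers, here is a minimal sketch that encodes the recipe above as a rough throw-scoring function. The 20-degree targets come from the article; the scoring scheme, thresholds, and names are illustrative assumptions, not Rober's actual method.

```python
# Illustrative only: encodes the article's "recipe" as a rough 0-1 score.
# The 20-degree targets are from the article; everything else is assumed.
IDEAL_ATTACK_ANGLE_DEG = 20.0  # angle of the rock relative to the water
IDEAL_PATH_ANGLE_DEG = 20.0    # angle of the rock's flight path

def skip_quality(attack_deg: float, path_deg: float,
                 spin_rps: float, flat_bottom: bool) -> float:
    """Penalize deviation from the 20/20 angles, reward spin
    (stability from the wrist flick) and a flat-bottomed rock."""
    angle_error = (abs(attack_deg - IDEAL_ATTACK_ANGLE_DEG)
                   + abs(path_deg - IDEAL_PATH_ANGLE_DEG))
    angle_score = max(0.0, 1.0 - angle_error / 40.0)
    spin_score = min(spin_rps / 10.0, 1.0)  # more flick -> more spin -> more stable
    return angle_score * spin_score * (1.0 if flat_bottom else 0.5)

print(skip_quality(20, 20, 10, True))   # ideal throw -> 1.0
print(skip_quality(35, 10, 3, False))   # sloppy throw -> ~0.06
```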
Firefighting Robot Snake Flies on Jets of Water

Using steerable jets of water like rockets, this robot snake can fly into burning buildings to extinguish fires. Fires have an unfortunate habit of happening in places that aren't easy to reach. Whether the source of the fire is somewhere deep within a building, or up more than a floor or two, or both, firefighters have few good options for tackling it. They can either pour water into windows (which doesn't always work that well), or they can try to get into the building, which is obviously dangerous.

At the International Conference on Robotics and Automation last month, researchers from Tohoku University and the National Institute of Technology, Hachinohe College, in Japan, presented a new kind of snake-like robot with the body of a fire hose. Like other snake robots, this one has the potential to wiggle its way into windows or other gaps in a structure, with the benefit of carrying and directing water as it goes. What's so cool about this particular design, though, is how it powers itself: by firing high-pressure jets of water downwards like rocket engines, it can lift itself off the ground and fly.

What's happening here might be complex to implement in practice, but in principle it's not too complicated: sets of steerable nozzle modules are distributed along the length of the hose. These modules siphon water out of the high-pressure stream inside the hose and spray it downwards. As the water exits downwards at high velocity, it pushes the hose upwards, and with enough of these modules squirting out high-pressure water, the entire hose can be lifted into the air. Just like a rocket, it doesn't depend on ground proximity to work, so as long as you keep feeding it more hose and water at high enough pressure, it'll go as high as you want. Since the nozzles are steerable, each module can direct itself independently, letting the hose weave through small gaps deep into a structure to find the source of a fire. The "head" module comes with a few extra degrees of freedom so the water stream can be directed more precisely. And of course, while the head nozzle is fighting the source of the fire, a byproduct of the body of the hose keeping itself airborne is that it drenches everything it passes over, while also keeping itself cool.

The 2-meter-long prototype in the video above is intended to be a single segment in a robot that can be extended to arbitrary length just by adding more segments. A gas engine powered a compressor that supplied water at 0.7 MPa. It worked reasonably well, as prototypes go, but it's really more of a proof of concept in hardware than anything else, and obviously there's a lot to do before a system like this could be useful in the real world. The researchers readily admit that their current control algorithms are "not sophisticated," and that they'll need to put some work into making the robot more stable, more controllable, and able to handle more modules. They're actively working on it, though, and we're looking forward to this tech being adapted to garden hoses as well.
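The physics behind the lift is the same as for a rocket: thrust equals mass flow times exit velocity. As a back-of-the-envelope check on the 0.7 MPa figure above, here is a small sketch; the nozzle area is an assumed ballpark, and losses in the real hardware are ignored.

```python
import math

RHO_WATER = 1000.0  # kg/m^3

def jet_thrust(gauge_pressure_pa: float, nozzle_area_m2: float) -> float:
    """Ideal water-jet thrust. With lossless flow (Bernoulli), the exit
    velocity is v = sqrt(2*P/rho), the mass flow is m_dot = rho*A*v, and
    the thrust is F = m_dot * v, which simplifies to 2*P*A."""
    v = math.sqrt(2.0 * gauge_pressure_pa / RHO_WATER)
    return RHO_WATER * nozzle_area_m2 * v * v

# 0.7 MPa supply (from the article) through an assumed 1 cm^2 nozzle:
f = jet_thrust(0.7e6, 1e-4)
print(f"thrust ~{f:.0f} N, enough to support ~{f / 9.81:.0f} kg of hose")
# ~140 N, i.e. roughly 14 kg -- plausible for a water-filled hose segment.
```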
Experimental drone uses AI to spot violence in crowds

Whether or not it works well in practice is another story. Drone-based surveillance still makes many people uncomfortable, but that isn't stopping research into more effective airborne watchdogs. Scientists have developed an experimental drone system that uses AI to detect violent actions in crowds. The team trained their machine learning algorithm to recognize a handful of typical violent motions (punching, kicking, shooting, and stabbing) and flag them when they appear in a drone's camera view. The technology could theoretically detect a brawl that on-the-ground officers might miss, or pinpoint the source of a gunshot.

As The Verge warned, the technology definitely isn't ready for real-world use. The researchers used volunteers in relatively ideal conditions (open ground, generous spacing, and dramatic movements). The AI is 94 percent accurate at its best, but that drops to an unacceptable 79 percent when there are ten people in the scene. As is, this system might struggle to find an assailant on a jam-packed street; what if it mistakes an innocent gesture for an attack? The creators expect to fly their drone system over two festivals in India as a test, but it's not something you'd want to rely on just yet.

There's a larger problem surrounding the ethical implications. There are already questions about abuse of power and reliability with facial recognition systems. Governments may be tempted to use this technology as an excuse to record aerial footage of people in public spaces, and could track the gestures of political dissidents (say, people holding protest signs or flashing peace symbols). It could easily be combined with other surveillance methods to create a complete picture of a person's movements. This might only find acceptance in limited scenarios where organizations both make it clear that people are on camera and reassure them that a handshake won't lead to police at their door.
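One way to see why accuracy falls off with crowd size: per-person errors compound across the scene. The sketch below is our own back-of-the-envelope illustration, not from the paper (it assumes independent per-person errors, which the researchers do not claim); the 6% and 21% error rates are simply 100 minus the accuracy figures quoted above.

```python
def p_any_false_flag(per_person_error: float, people: int) -> float:
    """Chance that at least one innocent gesture in the scene gets
    flagged, assuming (simplistically) independent per-person errors."""
    return 1.0 - (1.0 - per_person_error) ** people

# Error rates implied by the article's accuracy figures:
print(f"{p_any_false_flag(0.06, 10):.0%}")  # best-case error, 10 people: ~46%
print(f"{p_any_false_flag(0.21, 10):.0%}")  # worst-case error, 10 people: ~91%
```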
Look, up in the sky! It's Disney's new autonomous acrobatic robot

Disney's animatronics have come a long way from drunken pirates waving flagons of ale or hippos that wiggle their ears. In the (relatively) near future, robotic versions of Iron Man or Buzz Lightyear could be performing autonomous acrobatics overhead in Disney theme parks, thanks to the newly unveiled Stuntronics robot.

Animatronic characters have populated Disney parks for more than half a century, albeit often just looping a specific movement over and over. In recent years Disney Research has tried to make the robots more agile and interactive, developing versions that can grab objects more naturally and even juggle and play catch with visitors. Back in May, the company unveiled a prototype called Stickman. Basically a mechanical stick with two degrees of freedom, the robot could be flicked into the air like a trapeze artist, where it used a suite of sensors to tuck and roll in midair, perform a couple of backflips, and unfurl for landing.

Impressive as that is, Stickman was far more stick than man. In just a few short months, the project has evolved into Stuntronics, a robot that's noticeably more human. Designed to be a kind of robotic stunt double for a human actor, the Stuntronics robot can perform the same kind of autonomous aerial stunts thanks to a suite of sensors similar to Stickman's, including an accelerometer, a gyroscope array, and laser rangefinding. But unlike Stickman, Stuntronics can stick its landing too. The former bot tended to land flat on its back, but the new version can land feet-first, and hit what looks like a smaller target. Not only that, it can strike a heroic pose in the air before tucking back up ready for landing.

Disney Research scientists said that during a stage show or ride, other animatronics or human actors could perform the up-close, static scenes before the Stuntronics robot is wheeled out when the character needs to fly (or fall with style). Of course, there's no guarantee that this kind of thing will ever get off the ground (literally or figuratively), but it's always exciting to peek behind the curtain at Disneyland.
SAY HI TO CIMON, THE FIRST AI-POWERED ROBOT TO FLY IN SPACE

Just when you thought artificial intelligence (AI) was redefining life on Earth, think again! Meet CIMON, the first AI-powered robot launched into space. It lifted off from Florida on Friday, June 29th, aboard a SpaceX rocket carrying food and supplies for the crew of the International Space Station (ISS), which it will join to assist the astronauts. At CIMON's pre-launch news conference, Kirk Shireman, NASA's ISS program manager, said that a knowledge base, and the ability to tap into AI in a way that is useful for the task at hand, will be critical to sending humans farther and farther away from the planet.

CIMON (Crew Interactive Mobile Companion) is programmed to answer voice commands in English. The AI-powered robot is roughly the size of a volleyball and weighs 5 kilograms. CIMON will float through the zero-gravity environment of the space station, drawing on a database of information about the ISS. In addition to its assigned mechanical tasks, CIMON can even assess the moods of its human crewmates and interact with them accordingly.

An Intelligent Astronaut

CIMON is the brainchild of the European aerospace company Airbus. With its artificial intelligence powered by IBM, CIMON was initially built for the German space agency. Alexander Gerst, a German astronaut currently aboard the ISS, assisted with the design of CIMON's screen prompts and vocal controls. According to the mission description written by Airbus representatives, CIMON's mission calls for the robot to work with Gerst on three separate investigations: experimenting with crystals, working together with Gerst to solve a Rubik's cube, and performing a complex medical experiment using itself as an 'intelligent' flying camera. CIMON can interact with anyone on the ISS; the AI-powered robot will nod when any command is spoken in English. However, CIMON is programmed specifically to help Gerst during its first stay on the ISS. Gerst can put CIMON to work with spoken English commands like 'CIMON, could you please help me perform a certain experiment?' or 'Could you please help me with the procedure?' In response, CIMON will fly towards Gerst to start the interaction.

An Interactive Step Forward

CIMON knows whom it is talking to through its built-in facial recognition software. If you imagined CIMON would look like a mechanical robot, you'd be wrong: CIMON has a face of its own, a white screen with a smiley face. Once aboard the ISS, the astronaut AI assistant will float around by sucking air in and expelling it through special tubes. CIMON's mission demonstrates to researchers how humans and AI-powered technology can collaborate on future explorations. It will be a long time before intelligent robots are ready to undertake critical tasks in the final frontier, such as helping astronauts repair damaged spacecraft systems or treating sick crewmembers. But a beginning has been made with CIMON, and that day may well become a reality soon. On its first space mission, CIMON will stay in space for a few months and is scheduled to return to Earth in December. After its return, scientists will study and assess its abilities for future missions.
With the launch of CIMON, a lifelong space-exploration partnership between humans and machines may have just begun.
'Blind' Cheetah 3 robot can climb stairs littered with obstacles

MIT's Cheetah 3 robot can now leap and gallop across rough terrain, climb a staircase littered with debris, and quickly recover its balance when suddenly yanked or shoved, all while essentially blind. The 90-pound mechanical beast -- about the size of a full-grown Labrador -- is intentionally designed to do all this without relying on cameras or any external environmental sensors. Instead, it nimbly "feels" its way through its surroundings in a way that engineers describe as "blind locomotion," much like making one's way across a pitch-black room.

"There are many unexpected behaviours the robot should be able to handle without relying too much on vision," says the robot's designer, Sangbae Kim, associate professor of mechanical engineering at MIT. "Vision can be noisy, slightly inaccurate, and sometimes not available, and if you rely too much on vision, your robot has to be very accurate in position and eventually will be slow. So we want the robot to rely more on tactile information. That way, it can handle unexpected obstacles while moving fast."

Researchers will present the robot's vision-free capabilities in October at the International Conference on Intelligent Robots and Systems, in Madrid. In addition to blind locomotion, the team will demonstrate the robot's improved hardware, including an expanded range of motion compared to its predecessor Cheetah 2 that allows the robot to stretch backwards and forwards and twist from side to side, much like a cat limbering up to pounce.

Within the next few years, Kim envisions the robot carrying out tasks that would otherwise be too dangerous or inaccessible for humans to take on. "Cheetah 3 is designed to do versatile tasks such as power plant inspection, which involves various terrain conditions including stairs, curbs, and obstacles on the ground," Kim says. "I think there are countless occasions where we [would] want to send robots to do simple tasks instead of humans. Dangerous, dirty, and difficult work can be done much more safely through remotely controlled robots."

Making a commitment

The Cheetah 3 can blindly make its way up staircases and through unstructured terrain, and can quickly recover its balance in the face of unexpected forces, thanks to two new algorithms developed by Kim's team: a contact detection algorithm and a model-predictive control algorithm.

The contact detection algorithm helps the robot determine the best time for a given leg to switch from swinging in the air to stepping on the ground. For example, if the robot steps on a light twig versus a hard, heavy rock, how it reacts -- and whether it carries through with the step or pulls back and swings its leg instead -- can make or break its balance. "When it comes to switching from the air to the ground, the switching has to be very well done," Kim says. "This algorithm is really about, 'When is a safe time to commit my footstep?'"

The algorithm works by constantly calculating three probabilities for each leg: the probability of the leg making contact with the ground, the probability of force being generated once the leg hits the ground, and the probability that the leg is in midswing. It calculates these probabilities from gyroscope and accelerometer data and from the joint positions of the legs, which record each leg's angle and height with respect to the ground.
If, for example, the robot unexpectedly steps on a wooden block, its body will suddenly tilt, shifting its angle and height. That data immediately feeds into calculating the three probabilities for each leg, which the algorithm combines to estimate whether each leg should commit to pushing down on the ground or lift up and swing away to keep its balance -- all while the robot is virtually blind. "If humans close our eyes and make a step, we have a mental model for where the ground might be, and can prepare for it. But we also rely on the feel of touch of the ground," Kim says. "We are sort of doing the same thing by combining multiple [sources of] information to determine the transition time."

The researchers tested the algorithm in experiments with the Cheetah 3 trotting on a laboratory treadmill and climbing a staircase, both littered with random objects such as wooden blocks and rolls of tape. "It doesn't know the height of each step and doesn't know there are obstacles on the stairs, but it just ploughs through without losing its balance," Kim says. "Without that algorithm, the robot was very unstable and fell easily."

Future force

The robot's blind locomotion is also partly due to the model-predictive control algorithm, which determines how much force a given leg should apply once it has committed to a step. "The contact detection algorithm will tell you, 'this is the time to apply forces on the ground,'" Kim says. "But once you're on the ground, now you need to calculate what kind of forces to apply so you can move the body in the right way."

The model-predictive control algorithm predicts the positions of the robot's body and legs a half-second into the future, given a certain force applied by any given leg as it makes contact with the ground. "Say someone kicks the robot sideways," Kim says. "When the foot is already on the ground, the algorithm decides, 'How should I specify the forces on the foot? I have an undesirable velocity on the left, so I want to apply a force in the opposite direction to kill that velocity. If I apply 100 newtons in this opposite direction, what will happen a half second later?'" The algorithm is designed to make these calculations for each leg every 50 milliseconds, or 20 times per second.

In experiments, researchers introduced unexpected forces by kicking and shoving the robot as it trotted on a treadmill, and by yanking it by the leash as it climbed up an obstacle-laden staircase. They found that the model-predictive algorithm enabled the robot to quickly produce counter-forces to regain its balance and keep moving forward, without tipping too far in the opposite direction. "It's thanks to that predictive control that can apply the right forces on the ground, combined with this contact transition algorithm that makes each contact very quick and secure," Kim says.

The team has already added cameras to the robot to give it visual feedback on its surroundings. This will help in mapping the general environment and will give the robot a visual heads-up on larger obstacles such as doors and walls. But for now, the team is working to further improve the robot's blind locomotion. "We want a very good controller without vision first," Kim says. "And when we do add vision, even if it might give you the wrong information, the leg should be able to handle [obstacles]. Because what if it steps on something that a camera can't see? What will it do? That's where blind locomotion can help. We don't want to trust our vision too much."
This research was supported, in part, by Naver, the Toyota Research Institute, Foxconn, and the Air Force Office of Scientific Research.
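To make the division of labour between the two algorithms concrete, here is a minimal sketch. The three probabilities, the 50 ms update period, and the half-second prediction horizon come from the description above; the fusion rule, the toy one-dimensional model, and all names are our own simplifying assumptions, not MIT's implementation.

```python
CONTROL_PERIOD_S = 0.05  # forces recomputed every 50 ms (20 Hz), per the article
HORIZON_S = 0.5          # predict a half-second into the future, per the article

def fused_contact_probability(p_ground: float, p_force: float,
                              p_swing: float) -> float:
    """Contact detection: fuse the three per-leg probabilities (ground
    contact, force generation, midswing) into one contact estimate.
    This naive weighting is a stand-in for whatever Cheetah 3 really does."""
    contact_evidence = p_ground * p_force
    return contact_evidence / (contact_evidence + p_swing + 1e-9)

def commit_step(p_contact: float, threshold: float = 0.5) -> bool:
    """Commit the leg to pushing on the ground only when contact is likely;
    otherwise pull back and keep swinging."""
    return p_contact >= threshold

def corrective_force(mass_kg: float, unwanted_velocity_mps: float) -> float:
    """Model-predictive idea in one dimension: choose the force that,
    applied over the horizon, cancels an unwanted body velocity
    (F = m * dv / t). The real controller would re-solve this every
    CONTROL_PERIOD_S with a full body model."""
    return -mass_kg * unwanted_velocity_mps / HORIZON_S

# Example: the ~41 kg (90 lb) robot is kicked and picks up 0.5 m/s sideways.
print(commit_step(fused_contact_probability(0.9, 0.8, 0.1)))  # True: plant the foot
print(f"{corrective_force(41.0, 0.5):.0f} N")                 # ~-41 N to kill it
```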
Popcorn-Driven Robotic Actuators

Popcorn is a cheap, biodegradable way to actuate a robot (once). People toss around the word "novel" fairly often in robotics papers, but this right here is the definition of a novel mechanism, and it might be one of the most creative ideas I've seen presented at a robotics conference in a long time. This is not to say that popcorn is going to completely transform robotic actuation or anything, but it's weird enough that it might plausibly end up in some useful (if very specific) robotic applications.

Why use popcorn to power an actuator? You can think of unpopped kernels of popcorn as little nuggets of stored mechanical energy, and that energy can be unleashed and transformed into force and motion when the kernel is heated. This is a very useful property, even if it's something you can only do once, and the fact that popcorn is super cheap and not only biodegradable but also edible is just a bonus.

The "pop" in popcorn happens when enough heat is applied to vaporize the moisture inside the kernel. Over 900 kPa of internal pressure causes the yummy goo inside the kernel to explode out through the shell, expand, and then dry. Relative to the size of the original kernel, the volume of a popped piece of popcorn increases by a factor of at least five, although it can be much more, depending on how the kernel was heated. Because of this variability, the first step in this research was to properly characterize the popcorn. To do this, the researchers, from Cornell's Collective Embodied Intelligence Lab, picked up some Amish Country brand popcorn (chosen for its lack of additives or postharvest treatment) in white, medium yellow, and extra small white varieties. They heated each type using hot oil, hot air, microwaves, and direct heating with a nichrome resistance wire. The extra small white kernels, which were the cheapest at US $4.80 per kilogram, also averaged the highest expansion ratio, exploding to 15.7 times their original size when popped in a microwave.

Here's what the researchers suggest popcorn might be useful for in a robotics context:

• Jamming actuator. "Jamming" actuators are compliant actuators full of a granular fluid (coffee grounds, for example) that will bind against itself and turn rigid when compressed, most often by applying a vacuum. If you use popcorn kernels as your granular fluid, popping them will turn the actuator rigid. It's irreversible but effective: in one experiment, the researchers were able to use a jamming actuator filled with 36 kernels of popcorn to lift a 100-gram weight as it popped.

• Elastomer actuator. An elastomer actuator is a hollow tube made of an elastic material that's constrained in one direction, such that if the tube is expanded, it will bend. Usually these soft actuators are inflated with air, but you can do it with popcorn too, and the researchers were able to use a trio of these actuators to make a sort of three-fingered hand that could grip a ball.

• Origami actuator. Like elastomer actuators, origami actuators are constrained in one dimension to curl as they expand, but the origami structure allows this constraint to be built into the actuator as it's folded. The researchers used recycled Newman's Own Organic Popcorn bags to make their origami actuators, and 80 grams of popped kernels were able to hold up a 4 kg kettlebell.

• Rigid-link gripper.
Popcorn can be used indirectly as a power source by putting unpopped kernels in a flexible container between two plates with wires attached to them. As the popcorn pops, the plates are forced apart, pulling on the wires. This can be used to actuate whatever you want, including a gripper.

It's certainly true that you could do most of these things completely reversibly by using air instead of popcorn. But using air involves a bunch of other complicated hardware, while the popcorn only needs to be heated to work. Popcorn is also much easier to integrate into robots that are intended to be biodegradable (DARPA has been working on this), and it's quite cheap. It's probably best not to compare popcorn actuators directly to other types of robotic actuators, but rather to imagine situations in which a cheap or disposable robot would need a reliable single-use actuator to open or deploy something.
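As a quick sanity check on why a handful of kernels is enough to fill and rigidify an actuator, here is the expansion arithmetic using the article's figures. The 15.7x ratio and the 36-kernel count are from the article; the per-kernel volume is an assumed ballpark.

```python
EXPANSION_RATIO = 15.7   # extra-small white kernels in a microwave (article)
KERNEL_VOLUME_CM3 = 0.1  # assumed ballpark volume of one unpopped kernel

popped_one = KERNEL_VOLUME_CM3 * EXPANSION_RATIO
popped_36 = 36 * popped_one  # the jamming-actuator experiment used 36 kernels

print(f"one kernel: {KERNEL_VOLUME_CM3} cm^3 -> ~{popped_one:.1f} cm^3 popped")
print(f"36 kernels: ~{popped_36:.0f} cm^3 of popped volume to jam the actuator")
```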
Sprawling Wheel-Leg Robot Crawls and Climbs

The latest version of this skittery little sprawling robot can crawl like a turtle. We're always impressed by the way David Zarrouk (a professor at Ben-Gurion University of the Negev by way of UC Berkeley's Biomimetic Millisystems Lab) manages to extract a ton of functionality from the absolute minimum of hardware in his robots. In the past, we've seen clever designs like a steerable robot that uses only a single motor and a multi-jointed robot arm that uses a travelling motor to actuate all of its degrees of freedom.

At the 2018 IEEE International Conference on Robotics and Automation (ICRA) in Brisbane, Zarrouk presented an update to STAR, the Sprawl-Tuned Autonomous Robot that we first wrote about in 2013. Called Rising STAR, or RSTAR, it takes STAR's sprawling wheel-leg mobility and adds another degree of freedom that allows the body of the robot to move separately from the legs, changing its centre of mass to help it climb over obstacles.

RSTAR is the latest in Zarrouk's series of sprawling robots, designed to handle all kinds of terrain obstacles while minimizing the cost of transport. "Sprawl" in this context refers to the robot's legs, which are angled (adjustably) downwards and outwards from the body. RSTAR's added degree of freedom lets its body change its location relative to the legs, altering the robot's centre of mass. It seems like a simple change, but it enables a bunch of new behaviours -- not only can the robot climb over larger obstacles without flipping over, but it can also climb vertically up closely spaced walls and "crawl" through narrow gaps by adopting a legged walking gait. While the adjustable centre of mass helps keep the robot more stable, as the video shows, flipping over can actually be useful, since it enables the robot to switch between faster, more efficient round wheels and more capable spoked wheels (whegs). RSTAR's top speed is about 1 m/s on hard flat surfaces, and its turtle gait means that it can handle extremely soft or granular surfaces (like thick mud or sand) without getting stuck.
Demand for artificial intelligence & robotics experts to be higher by 50-60% in 2018

Artificial intelligence (AI) is the buzz in the jobs bazaar as machine learning and the Internet of Things (IoT) increasingly influence business strategies and analytics. Human resource and search experts estimate a 50-60% higher demand for AI and robotics professionals in 2018, even as machines take over repetitive manual work. "Machines are taking over repetitive tasks. Robotics, AI, big data, and analytics will be competencies that will be in great demand," said Shakun Khanna, senior director at Oracle for the Asia-Pacific region.

Organizations are being pushed to become even more efficient as jobs turn predictable, said Rishabh Kaul, co-founder of recruitment startup Belong, which helps clients search for and hire AI professionals. "There is a significant increase in the adoption of AI and automation across enterprises, leading to a skyrocketing of demand for professionals in these fields," he said. Jobs in the IoT ecosystem, related to engagement technologies and data capture among other areas, have grown fourfold in the last three years, according to estimates by Belong. Demand for AI professionals in the realm of data analysis, including data scientists, has grown by almost 76% in the past few years. The demand is at the entry level as well as the middle to senior ranks, across sectors such as banking, financial services and insurance (BFSI), e-commerce, startups, business process outsourcing (BPO), information technology (IT), pharmaceuticals, healthcare, and retail.

"Robotics is required by process-oriented companies for a better customer experience. It helps in cutting down cost and improves efficiency," said Thammaiah BN, managing director, Kelly Services India. "AI is helping companies to be in spaces so far not thought of. Organizations can accomplish new things, new products, and services through AI." Companies want to mine the data they have accumulated over the years, said Sinosh Panicker, partner, Hunt Partners. "AI helps them predict and position their products better and push out new things," he said.

However, there's an acute demand-supply mismatch for AI talent across industries, experts said. Candidates for AI roles related to natural language processing (NLP), deep learning, and machine learning are thin on the ground, according to the Belong Talent Supply Index. The ratio of available candidates to open jobs is 0.53 for deep learning, 0.63 for machine learning, and 0.71 for NLP. Only 4% of AI professionals in India have worked on cutting-edge technologies such as deep learning and neural networks, the key ingredients in building advanced AI solutions, said Kaul. A few academic institutions, such as the Indian Institutes of Technology (IITs) in Kharagpur and Kanpur, the Indian Institute of Information Technology (IIIT) in Hyderabad, and the Indian Institute of Science (IISc) in Bengaluru, have specialized disciplines or centres for artificial intelligence and machine learning. "In fact, according to our internal research, less than 2% of professionals who call themselves data scientists or data engineers have a PhD in AI-related technologies," said Kaul. Such is the need for talent that it is prompting top business schools, including the Indian Institutes of Management (IIMs), to include AI and machine learning in their curriculum and expose students to the full ecosystem of IoT.
The IIMs in Bangalore and Kozhikode and premier B-schools like the SP Jain Institute of Management & Research (SPJIMR) are offering courses on AI, robotics, and IoT that can be connected to business strategy to enhance performance, output, and customer experience. Some professionals are picking up these skills through various other courses, including online ones. "People who are keeping themselves abreast of new-age technologies and have the right set of required skills are in high demand," said ABC Consultants director Ratna Gupta.