Sunday, March 1, 2009

Thinking About the Future: An Introduction to Forecasting



From the third chapter of "Thinking about the Future," a book co-edited by Social Technologies' Andy Hines and futurist Peter Bishop, comes this Introduction to Forecasting.
Forecasting involves creating alternative futures. Most organizations, if not challenged, tend to believe the future is going to be pretty much like the past. If the analyst probes at an organization's view of the future, he or she will find an array of unexamined assumptions, which tend to converge around incremental changes--a.k.a. the "official future" or "baseline forecast"--that pretty much preserves the current paradigm or way of doing things. A key task for the analyst, therefore, is to challenge this view and prod the organization to take seriously the possibility that things may not continue as they have--in practice, they rarely do!
In essence, forecasting involves generating the widest range of creative possibilities, then consolidating and prioritizing the most useful for the organization to actively consider or prepare for as it moves forward.
A principal means of challenging the official future is to develop alternative futures. A key tenet of strategic foresight is that the future is inherently unknowable and efforts to get it exactly right are futile. What the analyst can offer is to expand the range and depth of possibilities for the organization to consider, thereby reducing the likelihood and magnitude of surprise. This in turn enables the organization to successfully navigate through what does emerge. The analyst may be asked to produce the "correct" future, that is, to predict what will happen. Organizations prefer the clarity of dealing with a single possibility rather than the messiness of alternatives. In response, the analyst must get across the idea that single-point forecasting is doomed and the organization will be better served by understanding and preparing for a range of possibilities.
Forecasting alternative futures does not mean developing detailed plans for every contingency, however. Rather it means monitoring the external environment for leading indicators--signs or guideposts that suggest events are heading towards one or more of the alternatives.
The guidelines in this section suggest how the organization can produce a useful set of alternative futures. A useful forecast challenges existing assumptions about the future, gets the organization to consider "what if," and thereby motivates it to plan and act differently. An alternative future that turns out to be off-base can still be useful if it has prompted the organization to take the future seriously, to consider and prepare for possibilities in a way that yields helpful learning and experience.
The preparatory work done in Framing begins to pay dividends here, incidentally. If Framing has helped the team function well, produced a work environment that is conducive to creativity, and so forth, the work done in Forecasting will be that much better.

Germany's CESAR crowned king of rovers in ESA’s Robotics Challenge


A robot rover designed by a Bremen university team has won an ESA contest to retrieve soil samples from a lunar-style terrestrial crater. Eight student teams fielded rovers during the event, their progress monitored by an advanced 3-D viewer already flight-tested in space and planned for eventual deployment on the Moon.
Craters surrounding the Moon's poles are a top 21st Century science target. Lunar researchers believe these craters may be 'cold traps', preserving ancient water ice deposits. Such ice would not only be an invaluable time capsule, it would also support manned lunar settlements. But the only way to verify the ice is there is to go fetch it, which is where rovers come in.
The bleak pumice landscape of Minas de San José within Tenerife's Teide National Park stood in for the Moon during the inaugural ESA Lunar Robotics Challenge (LRC). Built within strict size, weight and power constraints, the rovers had to descend the steep 40° slopes of a 15-metre-deep crater, grab 0.1 kg of specifically selected soil, then carry it out again – all the while in darkness.
Working from a trailer camp 2000 metres up, each five-strong team was confronted with some distinctly non-lunar weather, including heavy rain and clouds. In the event, only one rover managed to complete the assignment – Bremen's three-wheeled CESAR (Crater Exploration and Sample Return) robot, duly judged LRC winner on 26 October.
The rovers were not the only hardware undergoing field testing. "The location was an excellent place to exercise the capabilities of the Erasmus Recording Binocular (ERB)," explained Massimo Sabbatini of ESA's Human Spaceflight Directorate. "This innovative high-resolution stereo-video recorder will become part of the standard toolset of astronauts when surveying lunar areas."
Despite being more than 2000 km from the European mainland, the test site was kept linked to the internet by a satellite ground station provided by ESA's Telecom Directorate, which enabled remote team members to follow the event and even debug rover software as needed.
Francesco Feliciani of ESA's Telecom Directorate said: "The successful live transmission of the video streams from the LRC proves that the technology is ready to be used in real-life applications."
The broadband-quality transmission was made possible by the ESA co-funded AmerHis regenerative payload hosted aboard Hispasat's Amazonas spacecraft. AmerHis works as an internet router in the sky.
Stefano Badessi, also of ESA Telecom, commented: "Our satellite terminal allowed those team members who could not be on the Teide in person to watch how their teammates were faring."
Jury member Richard Fisackerly paid particular attention to the control stations the teams employed to communicate with their rovers. Working within ESA's Aurora Programme, which develops technology for manned and robotic exploration of the Moon and Mars, Fisackerly plans to apply the student experience to the terminals planned for lunar astronauts to interact with robot assistants.
Adverse weather conditions affected preliminary trials and also meant two of the rovers – fielded by the University of Pisa and Scuola Superiore Sant'Anna – were forced to perform the Challenge in daytime rather than at night.
This being so, the jury decided against formally ranking the teams but did compile a scoreboard to give feedback on strategic choices. LRC judge and organiser Gianfranco Visentin said: "We hope that this information will increase the wisdom of the teams for future competitions."
Also affecting scoreboard figures, the University of Surrey's SELENE proved unable to take part due to a mechanical failure, while Scuola Superiore Sant'Anna's pESApod was shorted out by rainfall.
Lucio Scolamiero, member of the jury and co-proposer of the LRC, declared: "Only six months ago most of the rovers were only at drawing-board level. What the student teams have done in such a short time has been a challenge in itself; they all deserve to be considered winners of this challenge."
The eight rover teams shared their experiences on 14 November at a special session concluding ESA's 10th Workshop on Advanced Space Technologies for Robotics and Automation.

GERMANY'S ROBOTS




Friendly robots at Germany's K2007 exhibition
24 September 2007

Visitors to the K2007 event in Dusseldorf will be able to shake hands with one of ABB's IRB140 robots. The event will take place on 24-31 October 2007, and will give visitors a literal first-hand look at the multitasking capabilities of robotic automation in the plastics industry.

They will be able to lead an IRB140 robot by the hand and teach it any moves they like. With this lead-by-hand technique, the user holds the robot gripper while a command de-energizes the robot so that it goes limp. The user then moves the robot by hand, simply demonstrating the required positions for a machine cycle or for part processing. In the case of part processing, such as cutting or polishing, the program afterwards re-runs the robot along the taught positions and automatically adds all the positions needed to generate the perfect processing path.

The IRB140 robot is a compact and powerful six-axis machine that combines fast acceleration, a large working area and a high payload.

As a further user convenience, ABB is introducing a unique new service for monitoring and fault-finding in robot applications, whereby diagnostics and even predictive machine-condition monitoring can be carried out over the internet. The robot controllers communicate wirelessly via an embedded web server to alert users to service requirements – maximising uptime, preventing unplanned stoppages and minimising maintenance costs.
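ABB's actual controller software is proprietary, but the record-and-replay idea behind lead-by-hand teaching can be sketched in a few lines. The Python below is purely illustrative: the class and method names are invented, and a real controller would sample full six-axis joint states at high frequency rather than a handful of hand-demonstrated poses.

```python
# Illustrative sketch only -- not ABB code. Lead-by-hand teaching:
# while the arm is de-energized, store the poses the user demonstrates;
# afterwards, replay them with interpolated intermediate points to
# produce a smooth processing path.

from typing import List, Tuple

Pose = Tuple[float, ...]  # one value per joint (six for an IRB140-class arm)

class LeadByHandTeacher:
    def __init__(self) -> None:
        self.taught: List[Pose] = []

    def record(self, pose: Pose) -> None:
        """Store a pose demonstrated while the robot is limp."""
        self.taught.append(pose)

    def replay_path(self, steps_between: int = 4) -> List[Pose]:
        """Re-run the taught positions, inserting interpolated points."""
        path: List[Pose] = []
        for a, b in zip(self.taught, self.taught[1:]):
            for i in range(steps_between):
                t = i / steps_between
                path.append(tuple(x + t * (y - x) for x, y in zip(a, b)))
        if self.taught:
            path.append(self.taught[-1])  # finish exactly on the last taught pose
        return path

teacher = LeadByHandTeacher()
teacher.record((0.0, 0.0, 0.0, 0.0, 0.0, 0.0))    # user moves the arm here...
teacher.record((10.0, 20.0, 0.0, 0.0, 0.0, 0.0))  # ...then here
path = teacher.replay_path(steps_between=2)
```

With two taught poses and two steps per segment, the replayed path contains the first pose, the linearly interpolated midpoint, and the final pose.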

Robot Versus Sniper


Securing an intersection is a basic combat task that the Pentagon hopes will one day be tackled by unattended robots. In the scenario below, set in 2020, an armed robot has forged ahead of a squad (not shown) to determine if a sniper is stationed at a key corner. As simple as the mission might seem, it’s a huge engineering challenge to program the skills needed for the assignment into a robot’s brain. Experts say success will require integrated sensors to double for human sensory organs and powerful processing of the data to mimic human training—and instinct. “What is intuition?” asks Jon Bornstein, head of the Army Research Lab’s robotics office. “A series of cues that give a high probability of something occurring.”
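Bornstein's definition of intuition as "a series of cues that give a high probability of something occurring" maps naturally onto probabilistic evidence fusion. As a purely illustrative sketch (the cue names and every likelihood figure below are invented), independent sensor cues can be combined into a posterior probability with a naive-Bayes odds update:

```python
# Toy illustration of "a series of cues that give a high probability":
# naive-Bayes fusion of independent sensor cues into a posterior
# probability that a sniper occupies the corner. All numbers are made up.

def fuse_cues(prior: float, cues: list) -> float:
    """Each cue is a pair (P(cue | sniper), P(cue | no sniper))."""
    odds = prior / (1.0 - prior)          # convert prior probability to odds
    for p_given_sniper, p_given_clear in cues:
        odds *= p_given_sniper / p_given_clear  # multiply in each likelihood ratio
    return odds / (1.0 + odds)            # convert odds back to probability

# Hypothetical cues: acoustic signature, muzzle-flash IR hit, open-window geometry
cues = [(0.7, 0.1), (0.6, 0.05), (0.8, 0.4)]
posterior = fuse_cues(prior=0.05, cues=cues)
```

Three moderately informative cues lift a 5 percent prior to roughly 90 percent: no single sensor is decisive, but the accumulation of cues is, which is exactly the "instinct" the engineers are trying to mimic.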

The MULE



This prototype of the Gladiator, built by Carnegie Mellon, is equipped with smoke generators and grenade launchers. Marine Corps officials say they will field-test the latest, 1-ton version this year. (Photograph by Craig Cameron Olsen)
Despite the challenges, armed UGV development is on the rise. Foster-Miller is currently working on a successor to the SWORDS, a larger and more versatile robot called the MAARS (Modular Advanced Armed Robotic System). Technicians in the field will be able to replace the system's M240 machine gun (the same kind currently planned for the MULE) with an arm, or trade the tracks for wheels. However, the MAARS requires a human operator to move and acquire targets.

iRobot, the maker of thousands of bomb-defusing PackBots, plans to introduce its Warrior X700 this year. The Warrior is larger than the PackBot, has a similar set of articulated tracks that allows it to climb stairs, and offers a 150-pound carrying capacity. The company is touting the Warrior's ability to fight fires, haul wounded and serve as a weapons platform. But according to Joe Dyer, the president of iRobot's Government & Industrial Robots division, a key benefit of an armed UGV isn't what it can dish out, but what it can take: "A robot can shoot second." The Warrior is able to follow GPS waypoints and can breach ditches and navigate cramped conditions on its own, but it will still rely heavily on human guidance in a fight. Where weapons are involved, Dyer says, "Autonomy's going to come into robots on little cat's feet."

Like their bomb-poking forebears, weaponized bots are disposable, making them particularly useful in urban warfare, with its high potential for collateral damage and sudden, point-blank firefights. "Robots are fearless, so there's an opportunity to better assess the situation," Dyer says. "That means less risk to noncombatants and to friendly forces." In urban warfare, where troops often lose the high-tech edge, an armed ground robot is the perfect point man. "Send a robotic platform into a room, and it might take some small arms fire," Shoop says. "But it can be repaired fairly easily. A soldier or Marine is not as easily repaired."

The MULE is toying with my emotions.
After running through its full range of articulated positions—a hilarious diagnostic dance routine that has it pivoting, rising and tipping its wheels off the road—the robot is now ramming a car. The sedan offers little resistance, sliding across the asphalt. Like proud owners watching their pit bull tear through a chew toy, the small crowd of defense contractors and engineers are chuckling. Next, the MULE climbs onto a 5-ft.-high platform and prepares to cross a 6-ft. gap. The robot reels back on its hind legs. It inches forward and falls across the space, its front wheels slamming onto the next platform. Although it was moved into position by a human operator, the robot's terrain-clearing performance was automatic, using internal sensors that track wheel position and speed and two onboard Pentium III processors cycling an array of mobility algorithms. Despite being blind, the MULE is already surprisingly autonomous. The exact details haven't been worked out, but the goal is for a single sergeant to handle multiple robots.

But no matter how sophisticated the robot, Lockheed officials point out, it will never fire without a command from a human operator. Having a person decide when to shoot is a recurring theme in discussions about armed robots. Maj. David Byers, the assistant program manager for the MULE, compares the likelihood of the robot's weapons discharging accidentally to a modern tank inexplicably firing off a round. Using the UGV's sensors, a human will confirm that each target is a hostile before firing. "Armed robots are still foreign to Army culture," he says. "We need to cultivate the understanding that they are quite safe."

The demonstration is winding down, and after a slew of caveats and reassurances, it's my turn to drive. On a grassy slope overlooking the track, an engineer hands me the Xbox 360 controller. I will not, I'm told, get to wear the shiny Rocketeer backpack.
The game controller is surprisingly standard-issue—no external tweaks or mods. When I hit a button, the prototype rumbles forward. I jam the thumbstick to one side, and the robot turns in place, its fresh wheel screeching, painting a perfect circle on the asphalt. I guide the MULE through a small parking lot, around cars, and across a muddy patch to give the tires a little more traction. The robot is responsive, literally leaning into turns and braking with finesse. My fingers keep hitting the unused buttons, automatically probing for the one that opens fire.

In the sci-fi cult classic The Last Starfighter, the teen hero is drafted into a galactic war. The arcade game he spent hours mastering was really an alien simulator, and with a quick costume change, he's reborn as an ace pilot. For 10 minutes, my fantasy is much better: Years of Saturday afternoons and missed classes and so-called sick days spent clicking away at a game console—it wasn't wasted time; it was training. I have become a crack military robot pilot.

Time's up, and I hand back the controller, the prototype still rumbling away, slightly muddier than I found it. We head down the hill, and as I pass an engineer, I mention how easy it is to drive. "Yeah, we based the controls on Project Gotham Racing," he says. It's a joke, but the quip offers a glimpse of what future warfare might look like—robotic, autonomous and just a little bit chilling.

America's Robot Army: Are Unmanned Fighters Ready for Combat?


At a muddy test track in Grand Prairie, Texas, 13 miles west of Dallas, the robot is winning. It has climbed on top of a sedan, its 2.5-ton bulk propped on the crumpled roof. The car never stood a chance.
The MULE (Multifunction Utility/Logistics and Equipment) is roughly the size of a Humvee, but it has a trick worthy of monster truck rallies. Each of its six wheels is mounted on an articulated leg, allowing the robot to clamber up obstacles that other cars would simply bump against. Right now, it's slowly extricating itself from the caved-in roof, undulating slightly as it settles into a neutral stance on the asphalt. This prototype's movements are precise, menacing and slow. When the final product rolls onto the battlefield in six years, it will clear obstacles in stride, advancing without hesitation. And, like the robot cars that raced through city streets in last fall's Pentagon-funded DARPA Urban Challenge, the MULE will use sensors and GPS coordinates to pick its way through a battlefield. If a target is detected, the machine will calculate its own firing solutions and wait for a remote human operator to pull the trigger. The age of killer robots is upon us.

But here at defense contractor Lockheed Martin's test track, during a demonstration for Popular Mechanics, this futuristic forerunner of the robot army has a flat tire. "Actually, this is good," says Michael Norman, Lockheed's project manager for the prototype. "You'll be able to see how quick it is to swap in a new tire." He nods toward an engineer holding an Xbox 360 controller and wearing a gigantic, gleaming backpack that contains a processing computer. The engineer taps a handheld touchscreen. One of the robot's wheeled legs rotates upward, and a two-man crew goes to work. Each leg has its own hub motor to allow for a variety of positions. If one leg is blown off by enemy fire or a roadside bomb, the rest are able to soldier on, with the robot automatically adjusting its center of gravity to stay mobile. It's highly functional.
But with its engine powered down—it runs on a Mercedes-built engine originally modified for unmanned aerial vehicles (UAVs)—and one leg cocked gamely in the air, the MULE doesn't look so tough right now. In fact, the MULE isn't ready for battle. Barely a year old, the prototype is a product of the Army's Unmanned Ground Vehicle program, which began in 2001. It has yet to fire a single bullet or missile, or even be fitted with a weapon. Here at the test track it's loaded down with rucksacks and boxes, two squads' worth of equipment. At the moment, the MULE has no external sensors. "We're 80 percent through the initial phase," Norman says, "but we don't have the perception fully tested. It knows heading and speed, but it's blind." In other words, it's essentially one of the world's biggest radio-control cars. And, eyeing the robot's familiar controller, I realize I might have a shot at driving it. I know my way around a video-game console, but the engineers are noncommittal about my request to drive the MULE.

The goal, of course, is for the MULE to drive itself. Sitting a short distance away is the prototype's future, a full-size mockup of a weaponized variant, its forward-facing machine gun bracketed by missile tubes. The gleaming sphere set on a short mast looks precisely like a robot's eyeball. It will visually track moving targets, allowing operators to zoom in for a closer look before pulling the trigger. According to the Army, this giant prop represents a revolutionary shift in how we will wage wars. This is the face of the robotic infantry.

Unmanned ground vehicles (UGVs) have already flooded the battlefield. There are at least 6000 robots in use by the Army and Marine Corps in Iraq and Afghanistan. For years these small, remote-control vehicles have allowed troops to peek around corners and investigate suspected bombs. And while unmanned aerial vehicles have been loaded with missiles since 2001, the arming of ground robots is relatively uncharted territory.
Last June the Army deployed the first-ever armed UGVs. Three SWORDS (Special Weapons Observation Remote Direct-Action System) robots landed in Iraq, each equipped with an M249 light machine gun. These UGVs are essentially guns on tracks, a variant of the remote-control Talon bots routinely blown up while investigating improvised explosive devices. When the trio was approved for combat duty, the potential for historic robot-versus-human carnage lit up the blogosphere. Never mind the dozens of air-to-ground Hellfire missiles that have already been launched by a squadron of armed Predator drones over the past seven years—this was a robot soldier, packing the same machine gun used by ground troops.

The historic deployment ended with a whimper after the Army announced that the SWORDS would not be given the chance to see combat. According to a statement from Duane Gotvald, deputy project manager of the Robotic Systems Joint Project Office, which oversees robots used by the Army and Marines, "While there has been considerable interest in fielding the system, some technical issues still remain and SWORDS is not currently funded." The robots never fired a shot, but Gotvald pointed out that the Army's 3rd Infantry Division used them for surveillance and "peacekeeping/guard operations."

The nature of the robots' "technical issues" remains an open question. The Army has not released details, and officials with Foster-Miller, the Massachusetts-based contractor that developed the SWORDS, refused interview requests for this story. But according to Col. Barry Shoop, deputy head of West Point's electrical engineering and computer science department, the reason armed UGVs continue to lag behind UAVs is their mission: close-quarters firefights. "The technical challenges are greater," Shoop says. "Think of the kind of image and graphics processing you need to make positive identification, to use lethal force. That's inhibiting."

AMMO IN ROBOT


THE SILVER BULLET:



It is shocking, but will it happen? The project has its critics, even in the Pentagon, where many doubt that technology can deliver such a "silver bullet". But the doubters are not in the ascendant, and it would be folly, against the background of the Iraq disaster and the hyper-militarised stance of the Bush administration, to write it off as a computer gamer's daydream.
One reason Washington finds it so attractive is that it fits closely with the ideologies of permanent war that underpin the "war on terror". What better in that war than an army of robot warriors, permanently cruising those parts of the globe deemed to be "supporting terrorism"? And what a boon if they destroy "targets" all on their own, with not a single US soldier at risk. Even more seductively, this could all take place out of sight of the capricious western media.
These technologies further blur the line between war and entertainment. Already, games featuring urban warfare in digitised Arab cities are everyday suburban entertainment - some are produced by the US forces themselves, while a firm called Kuma Reality offers games refreshed weekly to allow players to simulate participation in fighting in Iraq almost as it is happening in the real world.
Creepy as this is, it can be worse: those involved in real warfare may have difficulty remembering they are not playing games. "At the end of the work day," one Florida-based Predator operator reflected to USA Today in 2003, "you walk back into the rest of life in America." Will such people always remember that their "work day", lived among like-minded colleagues in front of screens, involves real death on the far side of the world? As if to strengthen the link with entertainment, one emerging military robot, the Dragon Runner, comes with a gamer's control panel. Greg Heines, who runs the project, confesses: "We modelled the controller after the Play Station 2 because that's what these 18-, 19-year-old marines have been playing with pretty much all of their lives."
The US aspiration to be able to kill without human involvement and with minimum risk raises some dreadful questions. Who will decide what data can be relied on to identify a "target"? Who will be accountable when there is an atrocity? And what does this say about western perceptions of the worth and rights of the people whose cities are no more than killing fields, and who themselves are mere "targets" to be detected, tracked and even killed by machines?
Finally, the whole process feeds alarmingly into the "homeland security" drive in the cities of the global north. The same companies and universities are supplying ideas to both, and the surveillance, tracking and targeting technologies involved are closely related. What we are seeing is a militarisation of urban life in both north and south that helps perpetuate the biggest and most dangerous myth of all, which is that technical and military solutions can somehow magic away resistance to George W Bush's geopolitical project.

ROBOT FEATURES


Seeing through concrete
Already in existence are sensors the size of matchboxes which respond to heat, light, movement or sound; and a variety of programmes, including one called Smart Dust, are working on further miniaturising these and improving their ability to work as networks. A dozen US university teams are also developing micro-aircraft, weighing a few grams each, that imitate birds and insects and could carry sensor equipment into specific buildings or rooms.
Darpa's VisiBuilding programme, meanwhile, is making "X-ray eye" sensors that can see through concrete, locating people and weapons inside buildings. And Human ID at a Distance is working on software that can identify individual people from scans of their faces, their manner of walking or even their smell, and then track them anywhere they go.
Closely related to this drive are projects involving computer simulations of urban landscapes and entire cities, which will provide backdrops essential for using the data gathered by cameras and sensors. The biggest is Urban Resolve, a simulated war against a full-scale insurgency in the Indonesian capital, Jakarta, in the year 2015.
Digitised cities
Eight square miles of Jakarta have been digitised and simulated in three dimensions. That will not surprise computer gamers, but Urban Resolve goes much further: the detail extends to the interiors of 1.6 million buildings and even the cellars and sewers beneath, and it also includes no fewer than 109,000 moving vehicles and people. Even the daily rhythms of the city have been simulated. The roads, says one commentator, "are quiet at night, but during weekday rush hours they become clogged with traffic. People go to work, take lunch breaks and visit restaurants, banks and churches."
Digitise any target city and integrate this with the flow of data from many thousands of sensors and cameras, stationary and mobile, and you have something far more powerful than the regular snapshots today's satellites can deliver. You have continuous coverage, around corners and through walls. You would never, for example, lose those mortar bombers who got out of their car and ran away.
All this brings omniscience within reach. The US web-based magazine DefenseWatch, which monitors developments in strategy and hardware, recently imagined the near-future scenario of an operation in the developing world in which a cloud of minute, networked sensors is scattered like dust over a target city using powerful fans. Directed by the sensors, unmanned drones patrol the city, building up a visual and audio picture of every street and building. "Every hostile person has been identified and located," continues the scenario. "From this point on, nobody in the city moves without the full and complete knowledge of the mobile tactical centre."
Another Darpa project, Integrated Sensor Is Structure, is working on the apex of such a system: huge, unmanned communications and surveillance airships that will loiter above target areas at an altitude of 70,000 feet - far above most airline traffic - providing continuous and detailed coverage over a whole city for a year or more.
From these platforms, all the information could be fed down in real time to soldiers and commanders carrying the hand-held computers being developed by the Northrop Grumman Corporation with Darpa funding. The real aim, however, is not to expose flesh-and-blood Americans on the ground, but where possible to use robots. That way there will be no "body bag problem"; and in any case machines are better equipped than human beings to process and make use of the vast quantities of data involved.
In one sense, robots are not new: already, armed drones such as Predator, "piloted" by CIA operators from screens in Florida, have been responsible for at least 80 assassination raids in Iraq, Afghanistan, Yemen and Pakistan (killing many civilians as well). Defence contractors have also developed ground-based vehicles capable of carrying cameras and weapons into the battlefield.
But this is only the start. What will make the next generation different is that they are being designed so that they can choose, all on their own, the targets they will attack. Operating in the air and on the ground, they are being equipped with Automated Target Recognition software capable not only of comparing signals received from new-generation sensors with databases of targets, but also of "deciding" to fire guns or launch missiles automatically once there is a good "fit". Automated killing of this kind hasn't been approved by anyone yet, but it is certainly being planned. John Tirpak, editor of Air Force Magazine in the US, expects initially that humans will retain the last word, but he predicts that once robots "establish a track record of reliability in finding the right targets and employing weapons properly", the "machines will be trusted to do even that".
Planners believe, moreover, that robot warriors have a doomsday power. Gordon Johnson, a team leader on Project Alpha, which is developing robots for the US army, predicts that, if the robot's gun can return fire automatically and instantly to within a metre of a location from which its sensors have detected a gunshot, it will always kill the person who has fired. "Anyone who would shoot at our forces would die," says Johnson. "Before he can drop that weapon and run, he's probably already dead. Well now, these cowards in Baghdad would have to pay with blood and guts every time they shoot at one of our folks. The costs of poker went up significantly. The enemy, are they going to give up blood and guts to kill machines? I'm guessing not."
Again, this may sound like the plot of a B-movie, but the US military press, not a body of people given to frivolity, has been writing about it for some time. DefenseWatch, for example, also featured robots in that future war scenario involving sensors dispersed by fans. Once a complete picture of the target city is built up, the scenario predicted, "unmanned air and ground vehicles can now be vectored directly to selected targets to take them out, one by one".




AMERICA'S ROBOT ARMY



Already there are killing machines operating by remote control. Soon the machines will be able to kill on their own initiative. A new warfare is on its way

War is about to change, in terrifying ways. America's next wars, the ones the Pentagon is now planning, will be nothing like the conflicts that have gone before them.

In just a few years, US forces will be able to deal out death, not at the squeeze of a trigger or even the push of a button, but with no human intervention whatsoever. Many fighting soldiers - those GIs in tin hats who are dying two a day in Iraq - will be replaced by machines backed up by surveillance technology so penetrating and pervasive that it is referred to as "military omniscience". Any Americans involved will be less likely to carry rifles than PlayStation-style consoles and monitors that display simulated streetscapes of the kind familiar to players of Grand Theft Auto - and they may be miles from where the killing takes place.

War will progressively cease to be the foggy, confusing, equalising business it has been for centuries, in which the risks are always high, everyone faces danger and suffers loss, and the few can humble the mighty. Instead, it will become remote, semi-automatic and all-knowing, entailing less and less risk to American lives and taking place largely out of the sight of news cameras. And the danger is close to home: the coming wars will be the "war on terror" by other names, conflicts that know no frontiers. The remote-controlled war coming tomorrow to Khartoum or Mogadishu, in other words, can happen soon afterwards, albeit in moderated form, in London or Lyons.

This is no geeky fantasy. Much of the hardware and software already exists and the race to produce the rest is on such a scale that US officials are calling it the "new Manhattan Project". Hundreds of research projects are under way at American universities and defence companies, backed by billions of dollars, and Donald Rumsfeld's department of defence is determined to deliver as soon as possible. The momentum is coming not only from the relentless humiliation of US forces at the hands of some determined insurgents on the streets of Baghdad, but also from a realisation in Washington that this is the shape of things to come. Future wars, they believe, will be fought in the dirty, mazy streets of big cities in the "global south", and if the US is to prevail it needs radically new strategies and equipment.

Only fragments of this story have so far appeared in the mainstream media, but enough information is available on the internet, in the comments of those in charge, and in the specialist press to leave no room for doubt about how sweeping it is, how dangerous, and how imminent.

Military omniscience is the starting point. Three months ago Tony Tether, director of the Defence Advanced Research Projects Agency (Darpa), the Pentagon's research arm, described to a US Senate committee the frustration felt by officers in Iraq after a mortar-bomb attack. A camera in a drone, or unmanned aircraft, spotted the attackers fleeing and helped direct US helicopters to the scene to destroy their car - but not before some of those inside had got out. "We had to decide whether to follow those individuals or the car," he said, "because we simply didn't have enough coverage available." So some of the insurgents escaped. Tether drew this moral: "We need a network, or web, of sensors to better map a city and the activities in it, including inside buildings, to sort adversaries and their equipment from civilians and their equipment, including in crowds, and to spot snipers, suicide bombers or IEDs [improvised explosive devices] . . . This is not just a matter of more and better sensors, but, just as important, the systems needed to make actionable intelligence out of all the data."

Darpa has a host of projects working to meet those needs, often in surprising ways. One, called Combat Zones That See, aims to scatter across cities thousands of tiny CCTV cameras, each equipped with wireless communication software that will make it possible to link their data and track the movements of every vehicle on the streets. The cameras themselves will not be that different from those found in modern mobile phones.
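The vehicle-tracking idea can be sketched in miniature: link time-stamped detections from separate cameras into one track, rejecting physically implausible jumps. This is only a toy illustration; the detection data, coordinates, and jump threshold are invented.

```python
import math

# Toy sketch of linking detections from a networked camera system into one
# vehicle track. Detection data and the jump threshold are invented.

def build_track(detections, max_jump=50.0):
    """Link time-ordered (t, x, y) detections into a single track,
    dropping any detection that jumps farther than max_jump metres
    from the last accepted point."""
    track = []
    for t, x, y in sorted(detections):
        if not track:
            track.append((t, x, y))
            continue
        _, px, py = track[-1]
        if math.hypot(x - px, y - py) <= max_jump:
            track.append((t, x, y))
    return track

# Detections reported out of order by three different cameras; the last
# one is a false match far across the city and gets rejected.
raw = [(2, 30.0, 10.0), (1, 15.0, 5.0), (0, 0.0, 0.0), (3, 500.0, 500.0)]
track = build_track(raw)
```

A real system would of course need probabilistic association across thousands of cameras, but the core step, tying detections together under a motion constraint, is the same.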

ROBOTS CONTINUE


Robots and Positioning

The Waseda Humanoid Robot Project has developed different types of robots. For human-robot symbiosis, we consider the physical function of the robots, including safety control and human-robot contact identification based on tactile recognition. Equally important to consider is their mind acquisition, such as intelligence and emotion. The developed robots are categorized as human-assisting humanoid robots, biped humanoid robots, intelligent robots, and emotional communication robots.

Japan's Graying Population

The team carefully studied the robots' symbiosis with the natural environment. Some robots are intended for autonomous movement and supply their own energy, so that they can survive in the natural environment and work closely with humans.

One of the main functions for most of these robots is the ability to determine their position and navigate. Providing a robot with the means to navigate is a huge undertaking — the team is working on only the first of many steps toward solving this problem. This first step is to use various sensors — inertial sensors, satellite navigation system receivers, magnetic sensors and others — operating in the coordinate domain. These sensors provide the robot with coordinates and coordinate-related parameters such as azimuth and orientation.

HUMANOID ROBOT

Waseda University established the Humanoid Robotics Institute in April 2000 to promote research that aims to construct a new relationship between humans and machines in an advanced information society. As researchers, we expect that sometime this century robots will provide housework assistance for the elderly, as well as entertainment and other functions to improve the quality of life for humans. To make this symbiosis between humankind, robot, and the environment possible, we need to build accommodations into a house's structure and functions.


THE WASEDA Humanoid Robot Project has developed different types of robots for different needs. Models include (from left) WABOT-1 from 1978, Wendy from 1999, and WABOT-2 from 1984. Hadaly-2 from 1997 is pictured at left.

The Waseda University WABOT-HOUSE (WAseda roBOT HOUSE) project, directed by Shigeki Sugano, was established to facilitate this research. Special test facilities for the project were created in Gifu Prefecture: in 2001 Waseda University established the WABOT-HOUSE Laboratory at the Techno Plaza R&D site in Kagamigahara City, in the middle of an industrial region. Gifu Prefecture initiated the project with the expectation that the robotics industry could provide a significant economic boost to the region, and the Japanese government designated Kagamigahara City a special robotics zone.

As a part of this project, the Waseda University team developed not only robotic components, but also design theories of environment space, construction, and social systems for human-robot symbiosis. These theories were implemented in the structures and functions of houses and facilities to allow an integration of robot technology with various environments. Fine artists joined with robotics engineers, architects, and IT researchers to take part in the WABOT-HOUSE project. Together, these experts are trying to design an optimal system from both robotic and architectural points of view to realize truly practical symbiosis between humans and robots.

As part of a system to enable the robots to navigate their environment, a GPS and pseudolite solution combined with other technologies is being investigated.
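At its core, a GPS or pseudolite fix means solving for position from measured ranges to transmitters at known locations. A minimal 2D sketch, with beacon positions and ranges invented for illustration (a real receiver also solves for its clock bias and works in three dimensions):

```python
# Sketch of the core of a GPS/pseudolite fix: recover a 2D position from
# ranges to three transmitters at known locations. Beacon positions and
# ranges are invented for illustration.

def trilaterate_2d(beacons, ranges):
    """Subtract the three circle equations pairwise to obtain a linear
    2x2 system in (x, y), then solve it directly."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = ranges
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Ranges measured from the true position (3, 4):
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
ranges = [5.0, 65.0 ** 0.5, 45.0 ** 0.5]
pos = trilaterate_2d(beacons, ranges)
```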

ROBOT LIFE


For more than 30 years a dedicated team of scientists and engineers at Waseda University in Tokyo, Japan, has been working on a project that will integrate robots into our everyday lives. One of the main functions for most of these robots is the ability to determine their position and navigate.




The first step is to use different sensors, such as inertial sensors, satellite navigation system receivers, magnetic sensors, RFID tags, and others, operating in the coordinate domain. These sensors provide the robot with coordinates and coordinate-related parameters such as azimuth and orientation. In general terms, the navigation process using these sensors can be described as artificial navigation.
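One common way to fuse such coordinate-domain sensors is a complementary filter: trust a fast but drifting sensor (a gyro) over short intervals, and a slow, noisy absolute sensor (a magnetometer) over long ones. A minimal sketch; the blend factor and sample data are invented:

```python
# Sketch of fusing coordinate-domain sensors with a complementary filter:
# the gyro is trusted short-term, the magnetic heading long-term.
# The blend factor alpha and the sample data are invented.

def complementary_filter(gyro_rates, mag_headings, dt=0.1, alpha=0.98):
    """Return fused headings: integrate the gyro rate each step, then
    pull the estimate gently toward the magnetometer reading."""
    heading = mag_headings[0]
    fused = []
    for rate, mag in zip(gyro_rates, mag_headings):
        predicted = heading + rate * dt                   # short-term: gyro
        heading = alpha * predicted + (1 - alpha) * mag   # long-term: magnetometer
        fused.append(heading)
    return fused

# Stationary gyro while the magnetometer jumps to 10 degrees: the estimate
# converges toward 10 instead of snapping to the noisy reading.
headings = complementary_filter([0.0] * 5, [0.0, 10.0, 10.0, 10.0, 10.0])
```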

The Waseda Humanoid Robot Project, headed by Professor Shuji Hashimoto, is researching the integration of robots into our social infrastructure, or human-robot symbiosis. Robots that can adapt to humans and their environment will be essential in an aging society and in a future of symbiosis with the natural environment (see sidebar).


MACHINE LIFE


MACHINE EVOLUTION
Perhaps on other planets, organisms have evolved into machine life, cyborgs, or an entirely new form of synthetic life we cannot as yet comprehend. Indeed, people have reported seeing aliens that appeared machine-like, with body armor or complex space suits.
These two publications are presented jointly here because they create a bridge between two generations of artists who nonetheless share concerns such as abandoning a strictly instrumental use of technology to embrace random phenomena and simulating human perception. Representing the first generation, active since the sixties, Norman White, a professor at the Ontario College of Art (OCAD, formerly OCA) in Toronto, Canada, together with his colleague Doug Back taught many artists from the subsequent generation (the eighties) during the seventies and eighties, including David Rokeby. In addition to the author contributions, the catalogues for these exhibitions include multimedia complements (a CD-ROM for "Machine Media" and "Norm’s Robots" as well as a DVD for "David Rokeby"), along with video excerpts to complete the documentation of each of the works exhibited.

The catalogue for "Machine Life" brings together three author contributions. In "Norman White, Beginning," Ihor Holubizky looks at the remarkable critical interest sparked by media arts between 1968 and 1970, a period when White’s work was first emerging. White took part in some of the major exhibits organized in the United States in the late sixties, including "Some More Beginnings: An Exhibition of Submitted Works Involving Technical Materials and Processes" in 1969 at the Brooklyn Museum (New York, U.S.). Holubizky follows White’s career from the artist’s first robotic works of the seventies to his recent projects that accentuate the playfulness of his work. The author presents White’s sometimes ambivalent views on the difficult relationship between art and technology. He concludes by underlining the entropy found in White’s work, which distinguishes it from many artistic projects modelled on the notion of technical or scientific progress. In "Taken with Surprise," Caroline Langill points out that like White, artists from the subsequent generation were interested in technology’s unpredictability. When computer units of a media artwork exchange data randomly, the end results produce a range of varied experiences for viewers. In "Encountering Machine Life," Jan Allen, like Caroline Langill, stresses the role of entropy in White’s works, which, according to this artist, "represent unusable experimental models." The artist’s process may tend toward a form of productive failure in which technology frees itself from the uses predetermined by its functions. Allen explores the years of creative exchanges between White and his former colleague Doug Back from the Ontario College of Art. The two artists shared an interest in manual work that resulted in the construction of technological components for artworks as well as a decompartmentalized approach to interactivity. A description of works presented within the exhibition follows.

The catalogue for "David Rokeby" includes two author contributions. In "Between Chaos and Order: The Garden and the Computer in the Work of David Rokeby," Su Ditta relates Rokeby’s work to a garden, a space inciting both action and contemplation. She tracks Rokeby’s career path, describing emblematic works created since the eighties. She separates projects exploiting sound and language (Very Nervous System [1986-2004], The Giver of Names [1991-] and n-cha(n)t [2001]) from projects concerned with the boundaries between computer vision, controlled by a function of analytical observation, and human vision (Watch [1995], Seen [2002] and Taken [2002]). Other works (Machine for Taking Time [2001] and Steamingmedia.org [2002]) are linked to complex devices in which a real site coexists simultaneously with several representations of the site captured at different times of the day or year. In "Interpolation: The Method of David Rokeby," Sara Diamond reports on Rokeby’s technological and aesthetic research while evoking artistic projects situated between technical application (software, protocol) and artistic intervention. Diamond emphasizes the concept of invention in Rokeby’s work and the manner in which he himself creates the technological tools required to produce his works so that these tools then exist independently (Very Nervous System, The Giver of Names). The author analyzes this last work from the perspective of a theoretical reflection on the differences between strictly human faculties and computer functions for capturing and analyzing data. Finally, Diamond comments on the artist’s use of sophisticated surveillance tools whereby the immersion in the image and the aesthetic experience are often accompanied by a paradoxical update of the technology’s essentially coercive functions.

Tuesday, February 24, 2009

MECHATRONICS FUTURE


I think mechatronics generally begins with mechanical design. That's just my perspective; it may differ in your experience. Whether it's a power window in a car, a hard disk drive platter machine, a blender, an amusement park ride or display, or a surgical robot, they all start with mechanical design: performance goals and boundary conditions that are required for the mechanical system to be useful. This is why there needs to be great emphasis on the design of software tools that are extensions to the 2D and 3D CAD products currently available. Obviously, if you are engaged in mechanical design, you are in a unique position to influence the final outcome of the design project. The mechanical design work sets the boundary conditions of what is possible. If the design is all steel and heavy components, then the speed and throughput of the design will be limited. But the same design in aluminum will be a third the weight, making it possible to increase speed and acceleration while the cost of the motor drive is reduced. Seems counterintuitive, but it works. But it doesn't end with mechanical design, and that's where we all get tied up. Is there a definable process for doing this kind of design work? There are some commonalities in the machinery-building community, but wide variations at the same time. So for software companies, it gets harder to go to the next level and create value for their customers. The design process is iterative. So there's a clue. What can we do to speed up the design process and improve the outcomes? Simulation: take the output of the designed components and apply animation rules to the 3D solids, which are very well characterized in the software, using computer technology to dynamically simulate the behavior of the design before it's built.
Just like the prototyping process, you can find out a lot of really useful information while the simulation is running, early in the design cycle and without the cost of building and exercising prototypes. Mind-blowing! You can do "what-if" all day at low cost and in very little time. So this is clearly the way to go. And engineering software companies are currently engaged in the process of creating these products. But the fun is only just beginning. What happens when the products we create have to be manufactured? There are a number of issues in the manufacturing and product documentation realm that become very complex without software tools to help with the tasking. New software products are being created to help with wire harness integration in automotive assemblies. Circuit boards can be modeled as 3D solid objects for the purpose of integration with packaging.
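The steel-versus-aluminum what-if above is exactly the kind of question a quick simulation answers. A minimal sketch using Newton's second law; the densities are standard handbook values, while the part volume and motor force are invented for illustration:

```python
# Sketch of the steel-versus-aluminum "what-if" using Newton's second law.
# Densities are standard handbook values; the part volume and motor force
# are invented for illustration.

STEEL_DENSITY = 7850.0      # kg/m^3
ALUMINUM_DENSITY = 2700.0   # kg/m^3, roughly a third of steel

def max_acceleration(volume_m3, density, motor_force_n):
    """Peak acceleration a = F/m for a part of the given volume."""
    mass_kg = volume_m3 * density
    return motor_force_n / mass_kg

volume = 0.002   # m^3, hypothetical moving assembly
force = 100.0    # N, hypothetical motor thrust
a_steel = max_acceleration(volume, STEEL_DENSITY, force)
a_aluminum = max_acceleration(volume, ALUMINUM_DENSITY, force)
# Same motor, same geometry: the aluminum part accelerates ~2.9x faster.
```

Running this kind of what-if loop over materials, motor sizes, and geometries, before anything is built, is the payoff the paragraph above describes.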

TYPES OF ROBOTS





Robots can be found in the manufacturing industry, the military, space exploration, transportation, and medical applications. Below are just some of the uses for robots.
Robots on Earth
Typical industrial robots do jobs that are difficult, dangerous or dull. They lift heavy objects, paint, handle chemicals, and perform assembly work. They perform the same job hour after hour, day after day with precision. They don't get tired and they don't make errors associated with fatigue and so are ideally suited to performing repetitive tasks. The major categories of industrial robots by mechanical structure are:
Cartesian robot /Gantry robot: Used for pick and place work, application of sealant, assembly operations, handling machine tools and arc welding. It's a robot whose arm has three prismatic joints, whose axes are coincident with a Cartesian coordinate system.
Cylindrical robot: Used for assembly operations, handling at machine tools, spot welding, and handling at diecasting machines. It's a robot whose axes form a cylindrical coordinate system.
Spherical/Polar robot: Used for handling at machine tools, spot welding, diecasting, fettling machines, gas welding and arc welding. It's a robot whose axes form a polar coordinate system.
SCARA robot: Used for pick and place work, application of sealant, assembly operations and handling machine tools. It's a robot which has two parallel rotary joints to provide compliance in a plane.
Articulated robot: Used for assembly operations, diecasting, fettling machines, gas welding, arc welding and spray painting. It's a robot whose arm has at least three rotary joints.
Parallel robot: One use is a mobile platform handling cockpit flight simulators. It's a robot whose arms have concurrent prismatic or rotary joints.
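The categories above are named for the coordinate system their joints trace out. A small sketch of mapping cylindrical and spherical (polar) target points into Cartesian coordinates, the conversion underlying how a controller for each robot type addresses a point in space:

```python
import math

# Mapping cylindrical and spherical (polar) coordinates to Cartesian,
# as a sketch of the coordinate systems named in the robot categories.

def cylindrical_to_cartesian(r, theta, z):
    """(radius, azimuth, height) -> (x, y, z)."""
    return (r * math.cos(theta), r * math.sin(theta), z)

def spherical_to_cartesian(rho, theta, phi):
    """(radius, azimuth, polar angle from the z-axis) -> (x, y, z)."""
    return (rho * math.sin(phi) * math.cos(theta),
            rho * math.sin(phi) * math.sin(theta),
            rho * math.cos(phi))

cyl = cylindrical_to_cartesian(2.0, math.pi / 2, 1.0)   # ~(0, 2, 1)
sph = spherical_to_cartesian(1.0, 0.0, math.pi / 2)     # ~(1, 0, 0)
```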
Industrial robots are found in a variety of locations including the automobile and manufacturing industries. Robots cut and shape fabricated parts, assemble machinery and inspect manufactured parts. Some types of jobs robots do: load bricks, die cast, drill, fasten, forge, make glass, grind, heat treat, load/unload machines, machine parts, handle parts, measure, monitor radiation, run nuts, sort parts, clean parts, profile objects, perform quality control, rivet, sand blast, change tools and weld.

Outside the manufacturing world robots perform other important jobs. They can be found in hazardous duty service, CAD/CAM design and prototyping, maintenance jobs, fighting fires, medical applications, military warfare and on the farm.

Farmers drive over a billion slow tractor miles every year on the same ground. Their land is generally gentle, and proven robot navigation techniques can be applied to this environment. A robot agricultural harvester named Demeter is a model for commercializing mobile robotics technology. The Demeter harvester contains controllers, positioners, safeguards, and task software specialized to the needs of commercial agriculture.

Some robots are used to investigate hazardous and dangerous environments. The Pioneer robot is a remote reconnaissance system for structural analysis of the Chornobyl Unit 4 reactor building. Its major components are a teleoperated mobile robot for deploying sensor and sampling payloads, a mapper for creating photorealistic 3D models of the building interior, a coreborer for cutting and retrieving samples of structural materials, and a suite of radiation and other environmental sensors.

An eight-legged, tethered, robot named Dante II descended into the active crater of Mt. Spurr, an Alaskan volcano 90 miles west of Anchorage. Dante II's mission was to rappel and walk autonomously over rough terrain in a harsh environment; receive instructions from remote operators; demonstrate sophisticated communications and control software; and determine how much carbon dioxide, hydrogen sulfide, and sulfur dioxide exist in the steamy gas emanating from fumaroles in the crater. Via satellite, Dante II sent back visual information and other data, as well as received instruction from human operators at control stations in Anchorage, Washington D.C., and the NASA Ames Research Center near San Francisco. Dante II saves volcanologists from having to enter the craters of active volcanoes. It also demonstrates the technology necessary for a robot to explore the surface of the moon or planets. That is, the robot must be able to walk on rough terrain in a harsh environment, receive instructions from remote operators about where to go next, and reach those commanded goals autonomously.

Robotic underwater rovers are used to explore and gather information about many facets of our marine environment. One example of underwater exploration is Project Jeremy, a collaboration between NASA and Santa Clara University. Scientists sent an underwater telepresence remotely operated vehicle (TROV) into the freezing Arctic Ocean waters to investigate the remains of a whaling fleet lost in 1871. The TROV was tethered to the surface boat Polar Star by a cable that carried power and instructions down to the robot, and the robot returned video images up to the Polar Star. The TROV located two ships, which it documented using stereoscopic video cameras and control mechanisms like the ones on the Mars Pathfinder. In addition to pictures, the TROV can also collect artifacts and gather information about the water conditions. By learning how to study extreme environments on earth, scientists will be better prepared to study environments on other planets.


Robots in Space
Space-based robotic technology at NASA falls within three specific mission areas: exploration robotics, science payload maintenance, and on-orbit servicing. Related elements are terrestrial/commercial applications which transfer technologies generated from space telerobotics to the commercial sector and component technology which encompasses the development of joint designs, muscle wire, exoskeletons and sensor technology.
Today, two important devices exist which are proven space robots. One is the Remotely Operated Vehicle (ROV) and the other is the Remote Manipulator System (RMS). An ROV can be an unmanned spacecraft that remains in flight, a lander that makes contact with an extraterrestrial body and operates from a stationary position, or a rover that can move over terrain once it has landed. It is difficult to say exactly when early spacecraft evolved from simple automatons to robot explorers or ROVs. Even the earliest and simplest spacecraft operated with some preprogrammed functions monitored closely from Earth. One of the best known ROVs is the Sojourner rover that was deployed by the Mars Pathfinder spacecraft. Several NASA centers are involved in developing planetary explorers and space-based robots.

The most common type of existing robotic device is the robot arm, often used in industry and manufacturing. The mechanical arm recreates many of the movements of the human arm, having not only side-to-side and up-and-down motion, but also a full 360-degree circular motion at the wrist, which humans do not have. Robot arms are of two types. One is computer-operated and programmed for a specific function. The other requires a human to actually control the strength and movement of the arm to perform the task. To date, the NASA Remote Manipulator System (RMS) robot arm has performed a number of tasks on many space missions, serving as a grappler, a remote assembly device, and a positioning and anchoring device for astronauts working in space.
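How a jointed arm maps joint angles to a wrist position can be sketched with planar two-link forward kinematics. The link lengths here are invented, and a real arm like the RMS has more joints and works in three dimensions:

```python
import math

# Planar two-link forward kinematics: a sketch of how joint angles map to
# a wrist position. Link lengths are invented for illustration.

def forward_kinematics(theta1, theta2, l1=1.0, l2=1.0):
    """Wrist (x, y) for shoulder angle theta1 and elbow angle theta2
    (radians), each measured from the previous link."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

reach = forward_kinematics(0.0, 0.0)   # fully extended along x: (l1 + l2, 0)
```

A computer-operated arm runs this mapping forward to check a programmed pose; a human-controlled arm effectively solves it in the operator's head.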

HAPTIC ROBOTS


The main purpose of this project is to design and build a functional mechatronics system that consists of a chair that responds to a simulated change in the environment, by rotating about one axis.

Our chair is a possible method of communicating with a vehicle's driver using haptics. Currently drivers use two senses to detect hazards on the road: hearing and sight. When implemented fully, our project would allow drivers to detect hazards on the road using their sense of touch as well. For this prototype we focused on the forward and backward movement of the driver's chair to simulate how the chair would react to obstacles and driving conditions.
FOR MORE INFORMATION, SEE:
http://www.mech.northwestern.edu/hartmann/ME333Files/ME333_FinalWebsites/HapticChair/ME333/index.htm

SUMMARY:
Our project consists of a chair and the actuation of the chair. We focused on only one degree of actuation for this project: the motion of tilting the chair forward and backward. The chair was modified to allow free movement back and forward, and is able to support up to 200 lbs of weight when static. Despite the free range of motion allowed by the chair, we limited the range of motion to -15 to +15 degrees to ensure the person feels secure. To accomplish this actuation, we implemented a system that rotates the chair about a support rod positioned at the base of the chair. The chair was moved by a Maxon motor, which required an amplifier and a 12V/10A power pack. The motion of the chair was monitored and controlled by an encoder attached to the motor, as well as an additional encoder used to simulate the position or velocity of the vehicle. In addition, we sensed whether the chair is near the end of its range of motion with the use of microswitches. With these two sensor systems, we are able to pinpoint the chair position and ensure the safety of the occupant.
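The encoder-plus-microswitch safety logic described above might be sketched like this. All names, the proportional gain, and the clamping rule are illustrative assumptions, not the project's actual control code:

```python
# Sketch of a tilt-chair safety loop: drive toward a target angle, but
# clamp to the +/-15 degree envelope and never push past a tripped limit
# switch. Names, gain, and clamping rule are illustrative assumptions.

TILT_LIMIT_DEG = 15.0   # the chair's +/-15 degree envelope

def motor_command(target_deg, encoder_deg, limit_switch_hit, gain=0.5):
    """Proportional drive toward the target tilt, clamped to the safe
    envelope; zero output rather than drive into a tripped switch."""
    clamped = max(-TILT_LIMIT_DEG, min(TILT_LIMIT_DEG, target_deg))
    command = gain * (clamped - encoder_deg)
    if limit_switch_hit and command * encoder_deg > 0:
        return 0.0   # command points further outward: refuse to move
    return command

cmd = motor_command(30.0, 10.0, limit_switch_hit=False)  # target clamped to 15
```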

LIST OF Accredited Engineering Qualifications Granted by Engineering Universities/Institutions in Pakistan


ISLAMABAD
Air University, Islamabad
Bachelor of Electronics Engineering (Intake of Batch Fall 2002 only).
Bachelor of Electrical Engineering (Electronic or Telecommunication) From Intake of Batch Fall-2003 upto 2004.
Bachelor of Mechatronics Engineering (Intake of Batch Fall-2003 only).
Bahria Institute of Management & Computer Sciences, Islamabad Campus (Bahria University, Islamabad)
Bachelor of Computer Engineering (From Intake of Batch Fall-2001 upto 2005, excluding of Batch-2003 which was not inducted by the university).
Bachelor of Software Engineering (Upto Intake of Batch Fall-2002).
Centre of Advanced Study in Engineering (CASE) Islamabad (UET, Taxila)
B.Sc. Electrical Engineering (with specializations in Computer or Telecommunication)- Intake of Batch 2004 only.
COMSATS Institute of Information Technology (CIIT), Islamabad Campus (COMSATS Institute Of Information Technology (CIIT), Islamabad)
B.Sc. Electrical (Telecommunication) Engineering (From Intake of Batch Fall-2002 to Fall-2003 only).
B.Sc. Computer Engineering (w.e.f. the Intake Year 1999 subject to review in 2004).
Institute of Space Technology (IST) Islamabad
Bachelor of Science (B.S.) in Communication Systems Engineering (From Intake of Batch Fall-2002 upto Fall-2004).
Bachelor of Science (B.S.) in Aerospace Engineering (From Intake of Batch Fall-2002 upto Fall-2006).
International Islamic University, Islamabad.
Bachelor of Science (B.S.) in Electronic Engineering (From Intake of Batch Fall-2003 upto Fall-2004).
Islamic International Engineering College, Islamabad (Riphah International University, Islamabad)
B.Sc Electrical (Communication) Engineering - (From Intake of Batch 1999 upto Fall-2003).
B.Sc. Electrical Engineering (Communication) - Intake Year 1999 to Intake Year 2002 (Formerly Islamic International Engineering College (UET Taxila), Rawalpindi).
Muhammad Ali Jinnah University, Islamabad Campus (Muhammad Ali Jinnah University, Karachi)
B.Sc. Electronic Engineering (From Intake of Batch Fall-2002 upto Fall-2003).
National University of Computer & Emerging Sciences, Islamabad.
Bachelor of Science (B.S.) in Telecommunication Engineering (From Intake of Batch Fall-2003 upto Fall-2004).
NUST Institute of Information Technology (NIIT), Rawalpindi (NUST Islamabad)
Bachelor of Information & Communication Systems Engineering (Intake of Batch Fall-2003 only).
Bachelor of Engineering (Electronic Engineering) (Intake of Batch Fall-2004 only).
PUNJAB
COMSATS Institute of Information Technology (CIIT), Lahore Campus (COMSATS Institute of Information Technology (CIIT), Islamabad)
B.Sc. Computer Engineering (From Intake of Batch Spring 2002 to Fall 2002).
COMSATS Institute of Information Technology, Wah Campus (COMSATS Institute Of Information Technology (CIIT), Islamabad).
B.Sc. Computer Engineering (From Intake of Batch Fall-2001 upto Fall-2003).
College of Agriculture, Multan (Bahauddin Zakariya University, Multan)
B.Sc. Agricultural Engineering (Intake of Batch 2004 only).
College of Electrical & Mechanical Engineering, Rawalpindi Campus (National University of Sciences and Technology, Islamabad)
B.E. (Electrical and Mechanical) From Intake of Batch 1993 to 2005.
B.E. Computer (From Intake of Batch 1996 to 2005).
B.E. Mechatronics (From Intake of Batch 1998 to 2005).
Institute of Quality and Technology Management, Lahore (The University of Punjab, Lahore)
B.Sc. Industrial Engineering & Management (Intake of Batch 2004 only).
Foundation University, Rawalpindi Campus (Foundation University Islamabad)
B.Sc. Software Engineering (From Intake of Batch 2001 to 2003).
Institute of Chemical Engineering and Technology, Lahore (University of the Punjab, Lahore) (Quaid-e-Azam Campus)
B.Sc. Chemical Engineering (Upto Intake of Batch 2003 only).
B.Sc. Metallurgy & Materials Science (Upto Intake of Batch 2003 only).
M.Sc. Tech upto 1968 thereafter B.Sc. Engg (Chemical, Mining, Metallurgical Engineering).
Military College of Signals, Rawalpindi Campus (National University of Sciences and Technology, Islamabad)
B.E. Electrical (Communication) From Intake of Batch 1993 to 2005.
B.E. Computer Software (From Intake of Batch 1996 to 2003).
National Textile University, Faisalabad (Former National College of Textile Engineering, University of Engineering and Technology, Lahore)
B.Sc. Textile Engineering. Re-accreditation is under process.
National University of Computer & Emerging Sciences, Islamabad (Lahore Campus)
B.Sc. Computer Engineering (Intake of Batch Fall-2003 only).
B.Sc. Telecommunication Engineering (Intake of Batch Fall-2003 only).
NFC Institute of Engineering & Fertilizer Research, Faisalabad (University of Engineering and Technology, Lahore)
B.Sc. Chemical Engineering, (Morning) From Intake of Batch 1998 upto 2003.
B.Sc. Electrical Engineering (From intake of Batch 2003 upto 2004).
NFC Institute of Engineering and Technological Training, Multan (Bahauddin Zakariya University Multan)
B.Sc. Computer System Engineering (From Intake of Batch 2001 to 2003).
B.Sc. Electronic Engineering (From Intake of batch 2001 to 2003).
B.Sc. Chemical Engineering (Upto Intake of Batch 2005).
The University of Central Punjab, Lahore
B.Sc. Electrical Engineering (From Intake of Batch Fall-2003 upto Fall-2004).
The University of Lahore
B.Sc. Electrical Engineering (From Intake of Batch Fall 2003 to Fall 2005).
University College of Engineering and Technology, Multan (Bahauddin Zakariya University, Multan)
B.Sc. Civil Engineering (From Intake of Batch 1994 upto 2003).
B.Sc. Electrical Engineering (From Intake of Batch 1997 upto 2003).
University College of Textile Engineering (BZU, Multan)
B.Sc. Textile Engineering (Intake Batch 2004 only).
University College of Engineering and Technology, Bahawalpur (The Islamia University of Bahawalpur)
B.Sc. Electronic Engineering (From Intake of Batch 2003 upto 2004).
University of Agriculture, Faisalabad
B.Sc. Agricultural Engineering (Upto Intake of Batch 2004).
University of Engineering and Technology, Taxila Campus (University of Engineering and Technology, Taxila)
B.Sc. Engineering (Civil, Electrical, and Mechanical) - Upto Intake of Batch 2006.
B.Sc. Computer Engineering (From Intake of Batch 2001 upto 2004).
B.Sc. Software Engineering (From Intake of Batch 2003 upto 2005).
University of Engineering and Technology, Lahore
B.Sc. Computer Engineering (From Intake of Batch 2003 upto 2004).
B.Sc. Building & Architectural Engineering (Upto Intake of Batch 2003 only).
B.Sc. Petroleum & Gas Engineering (Upto Intake of Batch 2005).
B.Sc. Geological Engineering (From Intake of Batch 2001 upto 2003).
B. Sc. Architectural Engineering (Intake of Batch 2002 only).
B.Sc. Chemical Engineering (Polymer) From Intake of Batch 2002 upto 2004.
B.Sc. Engineering (Civil, Electrical, Mechanical, Mining, and Metallurgical & Materials) - Upto Intake of Batch 2005.
B.Sc. Industrial & Manufacturing Engineering (From Intake of Batch 1999 upto 2005).
B.Sc. Mechatronics & Control Engineering (From Intake of Batch 2003 upto 2005).
B.Sc. Transportation Engineering (From Intake of Batch 2002 upto 2003).
B.Sc. Chemical Engineering.
NWFP
CECOS University of Information Technology and Emerging Sciences, Peshawar
B.Sc. Civil Engineering Re-accredited for one year for entry year 2003.
B.Sc. Electrical Engineering Re-accredited for one year for entry year 2003.
B.Sc. Civil Engineering. Intake Year 1997. The Petitioners of W.P No. 119/04.
B.Sc. Electrical Engineering. Intake Year 1997. The Petitioners of W.P No. 119/04.
College of Aeronautical Engineering, Risalpur Campus (National University of Sciences and Technology, Islamabad)
B.E. (Aerospace and Avionics) Upto Intake of Batch 2005.
COMSATS Institute of Information Technology Abbottabad Campus (COMSATS Institute of Information Technology, Islamabad)
B.Sc. Computer Engineering (From Intake of Batch Fall 2001 to Fall 2005).
B.Sc. Electronics Engineering (Intake of Batch Fall 2003 only).
Gandhara Institute of Science & Technology, PGS Engineering College (NWFP University of Engineering & Technology, Peshawar)
B.Sc. Civil Engineering (From Intake of Batch 2001 to 2003).
B.Sc. Electrical Engineering (From Intake of Batch 2001 upto 2002).
Ghulam Ishaque Khan Institute of Engineering Sciences and Technology, Topi - Swabi
B.Sc. Mechanical Engineering (Upto Intake Year 2004).
B.Sc. Electronic Engineering (Upto Intake of Batch 2003).
B.Sc. Engineering (Metallurgy & Materials, and Computer Systems)(From Intake of Batch 1993 to 2005).
B.Sc. Computer Software Engineering (Intake of Batch 2003 only).
B.Sc. Engineering Sciences (with majors in Modelling and Simulation, or Laser & Opto-Electronics, and Semi-Conductors and Super-Conducting Devices).
Military College of Engineering, Risalpur Campus (National University of Sciences and Technology, Islamabad)
B.E. Civil.
National University of Computer & Emerging Sciences, Islamabad (Peshawar Campus)
Bachelor of Science (B.S.) in Telecommunication Engineering (From Intake of Batch Fall-2003 upto Fall-2004).
NWFP University of Engineering & Technology, Peshawar Campus (NWFP University of Engineering & Technology, Peshawar)
B.Sc. Chemical Engineering (From Intake of Batch 1995 upto 2005).
B.Sc. Civil Engineering (Upto Intake Year 2004).
B.Sc. Electrical Engineering (Upto Intake of Batch 2005).
B.Sc. Computer Information Systems Engineering (from Intake Year 1999 to 2000).
B.Sc. Computer System Engineering (From Intake of Batch 2001 upto 2004).
B.Sc. Agricultural Engineering (Upto Intake of Batch 2005).
B.Sc. Mining Engineering (Upto Intake of Batch 2005).
B.Sc. Mechanical Engineering (Upto Intake of Batch 2005).
N.W.F.P. University of Engineering and Technology, Peshawar (Mardan Campus)
B.Sc. Telecommunication Engineering (From Intake of Batch 2002 upto Fall-2004).
B.Sc. Computer Software Engineering (From Intake of Batch 2003 upto Fall-2004).
N.W.F.P. University of Engineering and Technology, Peshawar (Bannu Campus)
B.Sc. Engineering (Civil, and Electrical ) - From Intake of Batch 2002 upto 2004.
Peshawar College of Engineering, Peshawar (NWFP University of Engineering & Technology, Peshawar)
B.Sc. Computer Systems Engineering (From Intake of Batch 2003 upto 2004).
B.Sc. Electrical Engineering (From Intake of Batch 1999 upto 2002).
SINDH
Bahria Institute of Management & Computer Sciences Karachi Campus (Bahria Institute of Management & Computer Sciences, Islamabad)
B.Sc. Computer Engineering (Accredited for Three Years for Intake Years Spring 2001 upto Fall 2003).
B.Sc. Software Engineering (Accredited for Three Years for Intake Years Fall 2000 upto Fall 2002).
Dawood College of Engineering and Technology, Karachi Campus (Mehran University of Engineering and Technology, Jamshoro)
B.E. Industrial & Management (Intake of Batch 2005 only).
B.E. Electronic (From intake of batch year 2000, including 1996-97 batch in which 1995-96 batch was merged, upto intake of batch 2004).
B.E. Chemical (From intake of batch year 2000, including 1996-97 batch in which 1995-96 batch was merged, upto intake of batch 2004).
B.E. Metallurgical (From Intake of Batch year-2000, including 1996-97 batch in which 1995-96 batch was merged, and intake of batch 2004 only).
Hamdard Institute of Information Technology (HIIT) Karachi (Hamdard University, Karachi)
B.E. Telecommunication (Intake of Batch 2004 only).
B.E. Computer System (From Intake of Batch Fall-2000 upto 2004).
B.E. Electronic (From Intake of Batch Fall-2004 upto 2005).
Iqra University, Karachi Campus (Iqra University, Karachi)
B.E. Electronics (Intake Year 2002).
Institute of Industrial Electronics Engineering (PCSIR), Karachi (NED-University of Engineering and Technology, Karachi)
B.E. Industrial Electronics (Upto Intake Year 2002).
Mehran University of Engineering and Technology, Jamshoro
B.E. Civil.
B.E. Electrical.
B.E. Mechanical.
B.E. Electronics.
B.E. Industrial.
B.E. Pet-gas.
B.E. Computer System.
B.E. Chemical.
B.E. Metallurgical.
B.E. Mining.
B.E. Textile (Awarded to persons admitted in 1994 and after, subject to review after one year i.e. in 2003).
B.E. Telecommunication (From Intake of Batch 2001 to 2005).
B.E. Software (From Intake of Batch 2002 upto 2003).
B.E. Bio-Medical (From Intake of Batch 2003 upto 2004 only).
National University of Computer & Emerging Sciences, Islamabad (Karachi Campus)
Bachelor of Science (B.S.) in Engineering (Computer, and Telecommunication) - From Intake of Batch Fall-2003 upto Fall-2004.
NED University of Engineering and Technology, Karachi
B.E. Electronic (From Intake of Batch 1998-99 upto 2006-07).
B.E. Electrical (Upto Intake of Batch 2006-07).
B.E. Petroleum (Intake of Batch 2005-06 only).
B.E. Urban (Intake of Batches 2001-02 upto 2003-04).
B.E. Telecommunication (Intake of Batch 2004-05 only).
B.E. (Civil, and Mechanical).
B.E. Computer Systems (Upto Intake of Batch 1998-99).
B.E. Computer & Information Systems (From Intake of Batch 1999-2000 to 2004-05).
B.E. Textile (From Intake of Batch 1995-96 to 2005-06).
B.E. Industrial & Manufacturing (From Intake of Batch 1999-2000).
B.E. (Chemical, Metallurgical, Electronics and Industrial)- awarded to persons enrolled with NED UET upto 1995-96 batch for programs offered at Dawood College of Engineering & Technology, Karachi.
Pakistan Navy Engineering College, Karachi Campus (National University of Sciences and Technology, Islamabad)
B.E. (Mechanical, Electrical) Upto Intake of Batch 2005.
B.E. Electronics (From Intake of Batch 1998 to 2005).
Pakistan Air Force-Karachi Institute of Economics & Technology (PAF-KIET), Karachi
B.E. Electronics (Intake Fall Year 2003 to spring 2004).
Plastics Technology Centre, Karachi (Hamdard University, Karachi)
B.E. Polymer (Intake of Batch 2004 only).
Quaid-e-Awam University of Engineering, Science & Technology, Nawabshah
B.E. Computer Systems (From Intake of Batch 1997 upto 2004).
B.E. Civil (Upto Intake of Batch 2004).
B.E. Electrical (Upto Intake of Batch 2004).
B.E. Mechanical (Upto Intake of Batch 2004).
Sir Syed University of Engineering & Technology, Karachi
B.S. Engineering (Computer, and Electronic) - Upto Intake of Batch 2004.
B.S. Civil Engineering (Upto Intake of Batch 2004).
B.S. Bio-Medical Engineering (Upto Intake of Batch 2004).
Sindh Agriculture University, Tandojam
B.E. (Agricultural).
Usman Institute of Technology, Karachi (Hamdard University, Karachi)
B.E. Computer Systems (From Intake of Batch 1995 upto 2004).
B.E. Electronic (Industrial) (Intake of Batch 2004-B only).
B.E. Electronic (From Intake of Batch 1995 upto 2004).
B.E. Telecommunication (Intake of Batch 2004 only).
BALOCHISTAN
Balochistan University of Engineering & Technology, Khuzdar
B.E. Civil (Upto Intake of Batch 2003).
B.E. Electrical (Upto Intake of Batch 2003).
B.E. Mechanical (Upto Intake of Batch 2003).
B.E. Computer System (Intake of Batch 2003 only).
Balochistan University of Information Technology and Management Sciences, Quetta (Takatoo Campus)
B.Sc. Computer Engineering (From Intake of Batch 2003 upto 2004).
B.Sc. Electronic Engineering (From Intake of Batch 2003 upto 2004)

MECHATRONICS INTERNATIONAL


Mechatronics International continues to grow strongly and remains focused on continuity and a well-established corporate culture. Our aim is to work with our customers and suppliers, providing them excellent support to build long-lasting business relationships.
We are a member of the "Lahore Chamber of Commerce and Industries (LCCI), Pakistan".

Our main working areas are:

1- Brazilian iron ore, HMS 1&2, used rail, metal waste, bauxite, and other materials.
2- Manganese Ore
3- Urea
4- Cement
5- Coal
6- Sugar
7- New/used Machinery (Electronics/Mechanical)
8- Business development in Pakistan.
9- Software development from Pakistan.
10- Miscellaneous.

WHAT IS MECHATRONICS?


Mechatronics (or Mechanical and Electronics Engineering) is the synergistic combination of mechanical engineering, electronic engineering, control engineering, and computer engineering to create useful products. The purpose of this interdisciplinary engineering field is the study of automata from an engineering perspective, serving the purpose of controlling advanced hybrid systems. The word itself is a combination of "Mechanics" and "Electronics".
Engineering cybernetics deals with the control engineering of mechatronic systems: it is used to control or regulate such a system (see control theory). Modern production equipment consists of mechatronic modules that are integrated according to a control architecture; the best-known architectures are hierarchy, polyarchy, heterarchy, and hybrid. Through collaboration, the mechatronic modules achieve the production goals and lend flexible and agile manufacturing properties to the production scheme. The methods for achieving a technical effect are described by control algorithms, which may or may not use formal methods in their design. Hybrid systems important to mechatronics include production systems, synergy drives, planetary exploration rovers, automotive subsystems such as anti-lock braking and spin-assist systems, and everyday equipment such as autofocus cameras, video and hard-disk drives, and CD players.
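To make the idea of a control algorithm concrete, here is a minimal sketch of the kind of feedback loop a mechatronic module might run: a discrete PID controller steering a toy plant toward a setpoint. Everything here (gains, the first-order plant model, the function names) is hypothetical and chosen only for illustration; a real controller would be tuned to the physical system it regulates.

```python
def make_pid(kp, ki, kd, dt):
    """Return a stateful PID step function mapping error -> control output."""
    state = {"integral": 0.0, "prev_error": 0.0}

    def step(error):
        state["integral"] += error * dt                      # accumulate error
        derivative = (error - state["prev_error"]) / dt      # rate of change
        state["prev_error"] = error
        return kp * error + ki * state["integral"] + kd * derivative

    return step


def simulate(setpoint=1.0, steps=200, dt=0.01):
    """Drive a toy first-order plant (y' = u - y) toward the setpoint."""
    pid = make_pid(kp=2.0, ki=1.0, kd=0.05, dt=dt)
    y = 0.0
    for _ in range(steps):
        u = pid(setpoint - y)   # control output from the current error
        y += (u - y) * dt       # simple, hypothetical plant dynamics
    return y


print(simulate())  # the output settles near the setpoint of 1.0
```

The same closed-loop pattern (measure, compute error, act) underlies anti-lock braking, autofocus, and disk-head positioning; only the plant model and the tuning change.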


HISTORY:

Mechatronics is centred on mechanics, electronics, control engineering, computing, and molecular engineering (drawing on nanochemistry and biology), which, combined, make possible the generation of simpler, more economical, reliable, and versatile systems. The portmanteau "mechatronics" was coined in 1969 by Tetsuro Mori, a senior engineer at the Japanese company Yaskawa. Mechatronics may alternatively be referred to as "electromechanical systems" or, less often, "control and automation engineering".