Droning On Secure From the Homeland...
#1
Pilotless drones kill 32 in Pakistan Province near Afghanistan

Wait, didn't I read about this about five days ago? Is it the same attack, or a second one?

There is so little coverage, and it's so sporadic, that I honestly don't know. So our pilotless attack, piloted by some teenage video-game expert promoted to anti-terrorist drone driver, bombs dozens of people on a border I will never see, and I hear about it only sporadically from Pakistani intelligence??? Or from US intelligence?? Oh well, it's pilotless, so nobody will get hurt; I guess everyone is right not to make a big deal about it... what's for supper? """""WHY DO THEY HATE US?""""" Cheers
Reply
#2
Yep, you're right, Nathaniel: this is the second drone attack in five days. The reason for it seems to have been spelled out HERE.
The shadow is a moral problem that challenges the whole ego-personality, for no one can become conscious of the shadow without considerable moral effort. To become conscious of it involves recognizing the dark aspects of the personality as present and real. This act is the essential condition for any kind of self-knowledge.
Carl Jung - Aion (1951). CW 9, Part II: P.14
Reply
#3
65(?) dead, killed by remote. Less coverage on CNN than the Israeli tennis player banned from Dubai. They hate us for our drone freedom.

I guess it's only a war if US lives are at risk, even one. At this rate, this could end up as the quietest genocide since Indonesia in 1965.
Reply
#4
Just wait until they use these kinds of drones in the UK, Europe, 'Down-Under', and N.A.!.....it won't be long now!.....:marchmellow:

Sure gives lots of time for a court case, with presumption of innocence, production of witnesses and evidence, and letting justice take its course - it is just summary execution of anyone suspected of being a problem. And if they make a mistake on the target, or the drone makes a mistake - well, 'sorry' should do. :eviltongue: I'd even go so far as to say the Oligarchy and their troops [private, intelligence and military, etc.] have only one thing left to offer most humans in all situations - that being 'sorry', and for the Americans, 'sorry, and have a nice day!'
Reply
#5
Features » February 15, 2009

Attack of the Killer Robots

The Pentagon’s dream of a techno army is doomed to fail.

By Eric Stoner

[Image: feat_stoner.jpg&w=310]
‘You don’t want your defenses to bankrupt you. If it costs $100,000 to defeat a $500 roadside bomb, that doesn’t sound like such a good strategy—as pretty as it may look on YouTube.’
One of the most captivating storylines in science fiction involves a nightmarish vision of the future in which autonomous killer robots turn on their creators and threaten the extinction of the human race. Hollywood blockbusters such as Terminator and The Matrix are versions of this cautionary tale, as was R.U.R. (Rossum’s Universal Robots), the 1920 Czech play by Karel Čapek that marked the first use of the word “robot.”
In May 2007, the U.S. military reached an ominous milestone in the history of warfare—one that took an eerie step toward making this fiction a reality. After more than three years of development, the U.S. Army’s 3rd Infantry Division, based south of Baghdad, deployed armed ground robots.
Although only three of these weaponized “unmanned systems” have hit Iraq’s streets to date, National Defense magazine reported in September 2007 that the Army has placed an order for another 80.
A month after the robots arrived in Iraq, they received “urgent material release approval” to allow their use by soldiers in the field. The military, however, appears to be proceeding with caution.
According to a statement by Duane Gotvald, deputy project manager of the Defense Department’s Robotic Systems Joint Project Office, soldiers are using the robots “for surveillance and peacekeeping/guard operations” in Iraq. By all accounts, robots have not fired their weapons in combat since their deployment more than a year and a half ago.
But it is only a matter of time before that line is crossed.
Future fighting force?

For many in the military-industrial complex, this technological revolution could not come soon enough.
Robots’ strategic impact on the battlefield, however—along with the moral and ethical implications of their use in war—has yet to be debated.
Designed by Massachusetts-based defense contractor Foster-Miller, the Special Weapons Observation Remote Direct-Action System, or SWORDS, stands three feet tall and rolls on two tank treads.
It is similar to the company’s popular TALON bomb disposal robot—which the U.S. military has used on more than 20,000 missions since 2000—except, unlike TALON, SWORDS has a weapons platform fixed to its chassis.
Currently fitted with an M249 machine gun that fires 750 rounds per minute, the robot can accommodate other powerful weapons, including a 40 mm grenade launcher or an M202 rocket launcher.
Five cameras enable an operator to control SWORDS from up to 800 meters away with a modified laptop and two joysticks. The control unit also has a special “kill button” that turns the robot off should it malfunction. (During testing, it had the nasty habit of spinning out of control.)
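To make that control arrangement concrete, here is a minimal, purely illustrative sketch of a remote-control loop with a hard kill switch: the vehicle keeps driving only while fresh operator packets arrive, and it halts the moment the kill button is pressed or the link goes quiet. Nothing here reflects the actual SWORDS software; every name, field and timeout below is an assumption made up for illustration.

Code:
# Purely illustrative: a remote-control loop with a hard kill switch, loosely
# modeled on the setup described above (operator laptop, joysticks, radio link).
# All names and timeouts are assumptions, not anything from the real system.
LINK_TIMEOUT_S = 0.5  # assumed: halt if the gap between operator packets exceeds this


class RobotDrive:
    """Stand-in for the vehicle's drive interface."""

    def apply(self, left: float, right: float) -> None:
        print(f"track speeds: left={left:.2f} right={right:.2f}")

    def emergency_stop(self) -> None:
        print("EMERGENCY STOP: motors disabled")


def control_loop(drive: RobotDrive, packets) -> None:
    """Consume (timestamp, left, right, kill) packets; stop on kill or a stale link."""
    last_ts = None
    for ts, left, right, kill in packets:
        link_stale = last_ts is not None and (ts - last_ts) > LINK_TIMEOUT_S
        if kill or link_stale:
            drive.emergency_stop()
            return
        drive.apply(left, right)
        last_ts = ts
    drive.emergency_stop()  # stream ended: fail safe


# Example: two normal packets, then the operator hits the kill button.
control_loop(RobotDrive(), [(0.0, 0.5, 0.5, False), (0.1, 0.4, 0.6, False), (0.2, 0.0, 0.0, True)])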
Developed on a shoestring budget of about $4.5 million, SWORDS is a primitive robot that gives us but a glimpse of things to come. Future models—including several prototypes being tested by the military—promise to be more sophisticated.
Congress has been a steady backer of this budding industry, which has a long-term vision for technological transformation of the armed forces.
In 2001, the Defense Authorization Act directed the Pentagon to “aggressively develop and field” robotic systems in an effort to reach the ambitious goal of having one-third of the deep strike aircraft unmanned within 10 years, and one-third of the ground combat vehicles unmanned within 15 years.
To make this a reality, federal funding for military robotics has skyrocketed. From fiscal year 2006 through 2012, the government will spend an estimated $1.7 billion on research for ground-based robots, according to the congressionally funded National Center for Defense Robotics. This triples what was allocated annually for such projects as recently as 2004.
The centerpiece of this roboticized fighting force of the future will be the 14 networked, manned and unmanned systems that will make up the Army’s Future Combat System—should it ever get off the ground. It is also one of the most controversial and expensive weapons programs the Pentagon has ever undertaken.
In July 2006, the Defense Department’s Cost Analysis Improvement Group estimated that its price tag had risen to more than $300 billion—an increase of 225 percent over the Army’s original $92 billion estimate in 2003, and nearly half of President Obama’s proposed stimulus package.
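As a quick sanity check on those figures (using only the numbers quoted above; the variable names are mine):

Code:
# Quick check of the cost growth quoted above; both dollar figures come from the article.
original_estimate = 92e9   # Army's original 2003 estimate
revised_estimate = 300e9   # 2006 Cost Analysis Improvement Group figure ("more than $300 billion")

growth = (revised_estimate - original_estimate) / original_estimate
print(f"Increase over the 2003 estimate: about {growth:.0%}")  # ~226%, consistent with the roughly 225 percent cited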
‘War in a can’

Despite the defense world’s excitement and the dramatic effect robots have on how war is fought, U.S. mainstream media coverage of SWORDS has been virtually nonexistent.
Worse, the scant attention these robots have received has often been little more than free publicity. Time magazine, for example, named SWORDS one of the “coolest inventions” of 2004. “Insurgents, be afraid,” is how its brief puff piece began. And while most articles are not that one-sided, any skepticism is usually mentioned as a side note.
On the other hand, prior to the deployment of SWORDS, numerous arguments in their defense could regularly be found in the press. According to their proponents—generally the robot’s designers or defense officials—robots will not have any of the pesky weaknesses of flesh-and-blood soldiers.
“They don’t get hungry,” Gordon Johnson, who headed a program on unmanned systems at the Joint Forces Command at the Pentagon, told the New York Times in 2005. “They’re not afraid. They don’t forget their orders. They don’t care if the guy next to them has just been shot. Will they do a better job than humans? Yes.”
Ronald Arkin, a leading roboticist at Georgia Tech, whose research the Defense Department funds, argues without a sense of irony that autonomous robots will be more humane than humans. Atrocities like the massacre by U.S. troops in Haditha, Iraq, would be less likely with robots, he told the Atlanta Journal-Constitution in November 2007, because they won’t have emotions that “cloud their judgment and cause them to get angry.”
Robots are also promoted as being cost-effective. On top of the annual salary and extra pay for combat duty, the government invests a great deal in recruiting, training, housing and feeding each soldier. Not to mention the costs of healthcare and death benefits, should a soldier be injured or killed.
By comparison, the current $245,000 price tag on SWORDS—which could drop to $115,000 per unit if they are mass-produced—is a steal.
After attending a conference on military robotics in Baltimore, journalist Steve Featherstone summed up their function in Harper’s in February 2007: “Robots are, quite literally, an off-the-shelf war-fighting capability—war in a can.”
And the most popular talking point in favor of armed robots is that they will save U.S. soldiers’ lives. To drive the point home, proponents pose this rhetorical question: Would you rather have a machine get blown up in Iraq, or your son or daughter?
Remove from reality

At first glance, these benefits of military robots sound sensible. But they fall apart upon examination.
Armed robots will be far from cost effective. Until these machines are given greater autonomy—which is currently a distant goal—the human soldier will not be taken out of the loop. And because each operator can now handle only one robot, the number of soldiers on the Pentagon’s payroll will not be slashed anytime soon. More realistically, SWORDS should best be viewed as an additional, expensive remote-controlled weapons system at the military’s disposal.
A different perspective is gained when the price of the robot is compared with the low-tech, low-cost weaponry that U.S. forces face on a daily basis in Iraq.
“You don’t want your defenses to be so expensive that they’ll bankrupt you,” says Sharon Weinberger, a reporter for Wired’s Danger Room blog. “If it costs us $100,000 to defeat a $500 roadside bomb, that doesn’t sound like such a good strategy—as pretty as it may look on YouTube and in press releases.”
The claim that robots would be more ethical than humans similarly runs contrary to both evidence and basic common sense.
Lt. Col. Dave Grossman writes in his 1996 book On Killing that despite the portrayal in our popular culture of violence being easy, “There is within most men an intense resistance to killing their fellow man. A resistance so strong that, in many circumstances, soldiers on the battlefield will die before they can overcome it.”
One of the most effective solutions to this quandary, the military has discovered, is to introduce distance into the equation. Studies show that the farther the would-be killer is from the victim, the easier it is to pull the trigger. Death and suffering become more sanitized—the humanity of the enemy can be more easily denied. By giving the Army and Marines the capability to kill from greater distances, armed robots will make it easier for soldiers to take life without troubling their consciences.
The Rev. G. Simon Harak, an ethicist and the director of the Marquette University Center for Peacemaking, says, “Effectively, what these remote control robots are doing is removing people farther and farther from the consequences of their actions.”
Moreover, the similarity that the robots have to the life-like video games that young people grow up playing will blur reality further.
“If guys in the field already have difficulties distinguishing between civilians and combatants,” Harak asks, “what about when they are looking through a video screen?”
Rather than being a cause for concern, however, Maj. Michael Pottratz at the Army’s Armament Research, Development and Engineering Center in Picatinny Arsenal, N.J., says in an e-mail that developers are in the process of making the control unit for the SWORDS more like a “Game Boy type controller.”
It is not only possible but likely that a surge of armed robots would lead to an increase in the number of civilian casualties, not a decrease.
The supposed conversation-ender that armed robots will save U.S. lives isn’t nearly as clear as it is often presented, either. “If you take a narrow view, fewer soldiers would die,” Harak says, “but that would be only on the battlefield.”
As happens in every war, however, those facing new technology will adapt to them.
“If those people being attacked feel helpless to strike at the robots themselves, they will try to strike at their command centers,” Harak says, “which might well be back in the United States or among civilian centers. That would then displace the battlefield to manufacturing plants and research facilities at universities where such things are being invented or assembled… The whole notion that we can be invulnerable is just a delusion.”
The new mercenaries

Even if gun-toting robots could reduce U.S. casualties, other dangerous consequences of their use are overlooked.
Frida Berrigan, a senior program associate at the New America Foundation’s Arms and Security Initiative and In These Times contributing editor, argues that similar to the tens of thousands of unaccountable private security contractors in Iraq, robots will help those in power “get around having a draft, higher casualty figures and a real political debate about how we want to be using our military force.”
In effect, by reducing the political capital at stake, robots will make it far easier for governments to start wars in the first place.
Since the rising U.S. death toll appears to be the primary factor that has turned Americans against the war—rather than its devastating economic costs or the far greater suffering of the Iraqi people—armed robots could also slow the speed with which future wars are brought to an end.
When Sen. John McCain (R-Ariz.) infamously remarked that he would be fine with staying in Iraq for 100 years, few noted that he qualified that statement by saying, “as long as Americans are not being injured or harmed or wounded or killed.”
Robot soldiers will be similar to mercenaries in at least one more respect. They both serve to further erode the state’s longstanding monopoly on the use of force.
“If war no longer requires people, and robots are able to conduct war or acts of war on a large scale, then governments will no longer be needed to conduct war,” Col. Thomas Cowan Jr. wrote in a March 2007 paper for the U.S. Army War College. “Non-state actors with plenty of money, access to the technology and a few controllers will be able to take on an entire nation, particularly one which is not as technologically advanced.”
This may not be farfetched.
In December 2007, Fortune magazine told the story of Adam Gettings, “a 25-year-old self-taught engineer,” who started a company in Silicon Valley called Robotex. Within six months, the company built an armed robot similar to the SWORDS—except that it costs a mere $30,000 to $50,000. And these costs will drop.
As this happens, and as the lethal technology involved becomes more accessible, Noel Sharkey, a professor of Artificial Intelligence and Robotics at the University of Sheffield in the United Kingdom, warns that it will be only a matter of time before extremist groups or terrorists develop and use robots.
Evidence now suggests that using armed robots to combat insurgencies would be counterproductive from a military perspective.
One study, published in the journal International Organization in June 2008, by Jason Lyall, an associate professor of international relations at Princeton, and Lt. Col. Isaiah Wilson III, who was the chief war planner for the 101st Airborne Division in Iraq and who currently teaches at West Point, looks at 285 insurgencies dating back to 1800.
After analyzing the cases, Lyall and Wilson conclude that the more mechanized a military is, the lower its probability of success.
“All counterinsurgent forces must solve a basic problem: How do you identify the insurgents hiding among noncombatant populations and deal with them in a selective, discriminate fashion?” Lyall writes in an e-mail.
To gain such knowledge, troops must cultivate relationships with the local population. This requires cultural awareness, language skills and, importantly, a willingness to share at least some of the same risks as the local population.
The Counterinsurgency Field Manual, which was released in December 2006 and co-authored by Gen. David Petraeus, would seem to agree.
“Ultimate success in COIN [counterinsurgency] is gained by protecting the populace, not the COIN force,” the manual states. “If military forces remain in their compounds, they lose touch with the people, appear to be running scared, and cede the initiative to the insurgents.”
Mechanized militaries, however, put greater emphasis on protecting their own soldiers. Consequently, Lyall and Wilson argue in their study that such forces lack the information necessary to use force discriminately, and therefore, “often inadvertently fuel, rather than suppress, insurgencies.”
Given such findings, deploying armed robots in greater numbers in Iraq or Afghanistan would likely only enflame resistance to the occupation, and, in turn, lead to greater carnage.
To understand this point, put yourself in the shoes of an Iraqi or Afghani. How could seeing a robot with a machine gun rumble down your street or point its weapon at your child elicit any reaction other than one of terror or extreme anger? What would you do under such circumstances? Who would not resist? And how would you know that someone is controlling the robot?
For all the Iraqis know, SWORDS is the autonomous killer of science fiction—American-made, of course.
The hope that killer robots will lower U.S. casualties may excite military officials and a war-weary public, but the grave moral and ethical implications—not to mention the dubious strategic impact—associated with their use should give pause to those in search of a quick technological fix to our woes.
By distancing soldiers from the horrors of war and making it easier for politicians to resort to military force, armed robots will likely give birth to a far more dangerous world.


Eric Stoner is a New York-based contributor to Foreign Policy in Focus. His articles have appeared in The Nation, NACLA, The Indypendent and The Huffington Post.

More information about Eric Stoner





"The philosophers have only interpreted the world, in various ways. The point, however, is to change it." Karl Marx

"He would, wouldn't he?" Mandy Rice-Davies. When asked in court whether she knew that Lord Astor had denied having sex with her.

“I think it would be a good idea.” Gandhi, when asked about Western Civilisation.
Reply
#6
Quote:If it costs $100,000 to defeat a $500 roadside bomb, that doesn’t sound like such a good strategy

But when you're busy making loadsa bucks, who cares about the cost - or about a few soldiers killed by a $500 (or much less?) IED?

If the US wanted to be an efficient war fighting machine it would find other solutions than throwing greenbacks at every obstacle it meets. But making bucks is the name of the game.
The shadow is a moral problem that challenges the whole ego-personality, for no one can become conscious of the shadow without considerable moral effort. To become conscious of it involves recognizing the dark aspects of the personality as present and real. This act is the essential condition for any kind of self-knowledge.
Carl Jung - Aion (1951). CW 9, Part II: P.14
Reply
#7
Well, yes. There is that point.

Reminds me of the hundreds of thousands of dollars that NASA spent trying to create a pen that wrote in zero gravity. The USSR used pencils.
"The philosophers have only interpreted the world, in various ways. The point, however, is to change it." Karl Marx

"He would, wouldn't he?" Mandy Rice-Davies. When asked in court whether she knew that Lord Astor had denied having sex with her.

“I think it would be a good idea.” Gandhi, when asked about Western Civilisation.
Reply
#8
Google Earth reveals secret history of US base in Pakistan




The US was secretly flying unmanned drones from the Shamsi airbase in Pakistan's southwestern province of Baluchistan as early as 2006, according to an image of the base from Google Earth.
The image — which is no longer on the site but was obtained by The News, Pakistan's English-language daily newspaper — shows what appear to be three Predator drones outside a hangar at the end of the runway. The Times also obtained a copy of the image, whose co-ordinates confirm that it is the Shamsi airfield, also known as Bandari, about 200 miles southwest of the Pakistani city of Quetta.
An investigation by The Times yesterday revealed that the CIA was secretly using Shamsi to launch the Predator drones that observe and attack al-Qaeda and Taleban militants around Pakistan's border with Afghanistan.

[Image: 2006image_489722a.jpg]
[Image: Recentimage_489741a.jpg]





US special forces used the airbase during the invasion of Afghanistan in 2001, but the Pakistani Government said in 2006 that the Americans had left. Both sides have since denied repeatedly that Washington has used, or is using, Pakistani bases to launch drones. Pakistan has also demanded that the US cease drone attacks on its tribal area, which have increased over the last year, allegedly killing several “high-value” targets as well as many civilians.
The Google Earth image now suggests that the US began launching Predators from Shamsi — built by Arab sheiks for falconry trips — at least three years ago.
The advantage of Shamsi is that it provides a discreet launchpad within minutes of Quetta — a known Taleban staging post — as well as Taleban infiltration routes into Afghanistan and potential militant targets farther afield.
Google Earth's current image of Shamsi — about 100 miles south of the Afghan border and 100 miles east of the Iranian one — undoubtedly shows the same airstrip as the image from 2006.
There are no visible drones, but it does show that several new buildings and other structures have been erected since 2006, including what appears to be a hangar large enough to fit three drones. Perimeter defences — apparently made from the same blast-proof barriers used at US and Nato bases in Afghanistan — have also been set up around the hangar.
A compound on the other side of the runway appears to have sufficient housing for several dozen people, as well as neatly tended lawns. Three military aviation experts shown the image said that the aircraft appeared to be MQ1 Predator unmanned aerial vehicles — the model used by the CIA to observe and strike militants on the Afghan border.
The MQ1 Predator carries two laser-guided Hellfire missiles and can fly for up to 454 miles, at speeds of up to 135mph and at altitudes of up to 25,000ft, according to the US Air Force website www.af.mil
The News reported that the drones were Global Hawks — which are generally used only for reconnaissance, flying for up to 36 hours, at more than 400mph and an altitude of up to 60,000ft. Damian Kemp, an aviation editor with Jane's Defence Weekly, said that the three drones in the image appeared to have wingspans of 48-50ft.
“The wingspan of an MQ1 Predator A model is 55ft. On this basis it is possible that these are Predator-As,” he said. “They are certainly not RQ-4A Global Hawks (which have a wingspan of 116ft 2in).”
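For what it is worth, the comparison being made here can be laid out explicitly. The sketch below encodes only the wingspans quoted in the article, plus the 48-50ft estimate from the Google Earth image; the nearest-match helper is an illustration of the reasoning, not how the identification was actually performed.

Code:
# Wingspans quoted above, in feet; the matching helper is purely illustrative.
REFERENCE_WINGSPANS_FT = {
    "MQ-1 Predator": 55.0,        # "The wingspan of an MQ1 Predator A model is 55ft"
    "RQ-4A Global Hawk": 116.17,  # 116ft 2in
}


def closest_type(observed_ft: float) -> str:
    """Return the reference type whose wingspan is nearest the observed estimate."""
    return min(REFERENCE_WINGSPANS_FT, key=lambda t: abs(REFERENCE_WINGSPANS_FT[t] - observed_ft))


for estimate in (48.0, 50.0):  # the editor's reading of the drones in the image
    print(f"{estimate} ft -> {closest_type(estimate)}")
# Both estimates sit far closer to the Predator than to the Global Hawk.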
Pakistan's only drones are Italian Galileo Falcos, which were delivered in 2007, according to a report in last month's Jane's World Air Forces.
A military spokesman at the US Embassy in Islamabad declined to comment on the images — or the revelations in The Times yesterday.
Major-General Athar Abbas, Pakistan's chief military spokesman, was not immediately available for comment. He admitted on Tuesday that US forces were using Shamsi, but only for logistics.
He also said that the Americans were using another air base in the city of Jacobabad for logistics and military operations. Pakistan gave the US permission to use Shamsi, Jacobabad and two other bases — Pasni and Dalbadin — for the invasion of Afghanistan in October 2001.
The image of the US drones at Shamsi highlights the extraordinary power — and potential security risks — of Google Earth.
Several governments have asked it to remove or blur images of sensitive locations such as military bases, nuclear reactors and government buildings. Some have also accused the company of helping terrorists, as in 2007, when its images of British military bases were found in the homes of Iraqi insurgents.
Last year India said that the militants who attacked Mumbai in November had used Google Earth to familiarise themselves with their targets. Google Street View, which offers ground-level, 360-degree views, also ran into controversy last year when the Pentagon asked it to remove some online images of military bases in America.



******************************************************************
Secrecy and denial as Pakistan lets CIA use airbase to strike militants

http://www.timesonline.co.uk/tol/news/wo...755490.ece

The CIA is secretly using an airbase in southern Pakistan to launch the Predator drones that observe and attack al-Qaeda and Taleban militants on the Pakistani side of the border with Afghanistan, a Times investigation has found.
The Pakistani and US governments have repeatedly denied that Washington is running military operations, covert or otherwise, on Pakistani territory — a hugely sensitive issue in the predominantly Muslim country.
The Pakistani Government has also repeatedly demanded that the US halt drone attacks on northern tribal areas that it says have caused hundreds of civilian casualties and fuelled anti-American sentiment.
But The Times has discovered that the CIA has been using the Shamsi airfield — originally built by Arab sheikhs for falconry expeditions in the southwestern province of Baluchistan — for at least a year. The strip, which is about 30 miles from the Afghan border, allows US forces to launch a drone within minutes of receiving actionable intelligence, and to attack targets further afield.




It was known that US special forces used Shamsi during the invasion of Afghanistan in 2001, but the Pakistani Government declared publicly in 2006 that the Americans had left it and two other airbases.
Key to the Times investigation is the unexplained delivery of 730,000 gallons of F34 aviation fuel to Shamsi. Details were found on the website of the Pentagon’s fuel procurement agency.
The Defence Energy Support Centre site shows that a civilian company, Nordic Camp Supply (NCS), was contracted to deliver the fuel, worth $3.2 million, from Pakistan Refineries near Karachi.
It also shows the fuel was delivered last year, when the United States escalated drone attacks on Pakistan’s lawless tribal areas, allegedly killing several top Taleban and al-Qaeda targets, but also many civilians.
A source at NCS, which is based in Denmark, confirmed that the company had been awarded the contract and had supplied the fuel to Shamsi, but declined to give further details.
A spokesman for the US embassy in Pakistan told The Times: “Shamsi is not the final destination.” However, he declined to elaborate and denied that the US was using it as a base.
“No. No. No. No. No. We unequivocally and emphatically can tell you that there is no basing of US troops in Pakistan,” he said. “There is no basing of US Air Force, Navy, Marines, Army, none, on the record and emphatically. I want that to be very clear. And that is the answer any way you want to put it. There is no base here, no troops billeted. We do not operate here.”
He said that he could not comment on CIA operations.

The CIA declined to comment, as did the Pentagon. But one senior Western source familiar with US operations in Pakistan and Afghanistan told The Times that the CIA “runs Predator flights routinely” from Shamsi.
“We can see the planes flying from the base,” said Safar Khan, a local journalist. “The area around the base is a high-security zone and no one is allowed there.”
He said that the outer perimeter of Shamsi was guarded by Pakistani military, but the airfield itself was under the control of American forces.
Shamsi lies in a sparsely populated area about 190 miles southwest of the city of Quetta, which US intelligence officials believe is used as a staging post by senior Taleban leaders, including Mullah Omar. It is also 100 miles south of the border with Afghanistan’s southern province of Helmand and about 100 miles east of the border with Iran.




That would put the Predators, which have a range of more than 2,000 miles and can fly for 29 hours, within reach of militants in Baluchistan, southern Afghanistan and in Pakistan’s northern tribal areas.
Paul Smyth, head of operational studies at the Royal United Services Institute, said that 730,000 gallons of F34, also known as JP8, was not enough to supply regular Hercules tanker flights but was sufficient to sustain drones or helicopters.
Other experts said that Shamsi’s airstrip was too short for most aircraft, but was big enough for Predators and ideally located as there were few civilians in the surrounding area to witness the drones coming and going.
Farhatullah Babar, a spokesman for the President of Pakistan, Asif Ali Zardari, said that he did not know anything about the airfield. However, Major-General Athar Abbas, the chief military spokesman, confirmed that US forces were using Shamsi. “The airfield is being used only for logistics,” he said, without elaborating.
He added that the Americans were also using another airbase near Jacobabad, 300 miles northeast of Karachi, for logistics and military operations.
Pakistan gave America permission to use Shamsi, Jacobabad and two other bases — Pasni and Dalbadin — for the invasion of Afghanistan in October 2001. US Marine Special Forces were based at Shamsi and, in January 2002, a US Marine KC130 tanker aircraft crashed close to its runway, killing seven Marines on board.
Jacobabad became the main US airbase until Bagram, near Kabul, was repaired, while Pasni, on the coast, was used for helicopters and Dalbadin as a refueling post for special forces’ helicopters. However, in December 2001, Pakistan began sharing Jacobabad and Pasni with US forces as India and Pakistan began massing troops on their border. In July 2006 the Pakistani Government declared that America was no longer using Shamsi, Pasni and Jacobabad, although they were at its disposal in an emergency.
The subject has become particularly sensitive in the past few weeks as President Obama has made it clear that he will continue the strikes while reviewing overall US strategy in the region.
The latest strike on Monday — the fourth since Mr Obama took office — killed 31 people in the tribal agency of Kurram, and another on Saturday killed 25 people in South Waziristan, according to Pakistani officials.
Shah Mehmood Qureshi, the Pakistani Foreign Minister, responded on Sunday by categorically denying that Pakistani bases were used for US drone attacks.
Aerial assault
— Armed Predator unmanned aerial vehicles (UAVs) have been in use since 1999
— The aircraft is controlled from the ground using satellite systems and onboard cameras
— The MQ9 craft, which is used in Afghanistan, is 11m long, has a 20m wing span and a cruise speed of up to 230mph. Each can carry four Hellfire missiles and two bombs
— Three systems were bought by the RAF last year for £500m
Sources: Jane’s Information, US Air Force, RAF, Times archives
"The philosophers have only interpreted the world, in various ways. The point, however, is to change it." Karl Marx

"He would, wouldn't he?" Mandy Rice-Davies. When asked in court whether she knew that Lord Astor had denied having sex with her.

“I think it would be a good idea.” Gandhi, when asked about Western Civilisation.
Reply

