Category: Current Events


War with the Newts

No, not the one by Karel Čapek, but the one going on in the Republican primaries.

As the weeks of dueling sound bites and one-upmanship ground on, there were some comical moments: Romney worrying about pink slips, Gingrich’s excessive patriotism leading to slips in his marital vows, or Santorum’s slip that overloading the planet with greenhouse gases is good for the plants.

But, sometimes, a candidate gets on the phone with NPR and thinks he can bluff his way through the interview, dodging questions and the like.

Enter Newt Gingrich earlier this week.

Inskeep asked Gingrich if he thought he should pull out, since the former Speaker offered just that advice back in 1996 to those running behind the front-runner.  Newt’s response?  Well, he has no incentive to get out because he “cares about the future of the party.”  Apparently, those in 1996 didn’t care enough.

Then Inskeep asked Newt if his criticisms of the President over global oil prices were valid. Gingrich said of course they’re valid because of simple supply and demand: drill more domestically (and don’t “bait Saudi Arabia to pump more”) and the prices will drop.  Inskeep countered with the fact that under Obama, domestic production has gone up and we import less foreign oil than we have in a decade (the opposite of the trends under the previous administration), and yet prices are not dropping.  Newt countered that all that new supply is due to private enterprise, not Obama, and that the U.S. could be the world’s leading oil producer.  Never mind that 1) supply is supply, and where it comes from would be irrelevant to driving down global prices (if that’s all it took), and 2) the U.S. simply is not sitting on enough oil to become the largest producer.  Both claims are factually wrong and show how fallacious his argument about gas prices is.

Some of Gingrich’s other statements also reveal what is so wrong with the direction of our politics.  Following up on the gas prices question, Inskeep asked the former Speaker if the price of oil (and thus gas) isn’t really related to the tensions and conflicts in the Middle East.  Newt responded: “But there wouldn’t have to be tensions in the Middle East if we were deliberately producing enough oil that we didn’t care about the Middle East.”  So we really are fighting for oil over there, not freedom, democracy, or any other high-brow motivation.  Unfortunately, I’m quite sure that this is mostly true.

Earlier, when Inskeep cited Gingrich’s only hope as preventing Romney from getting a majority, Newt responded: “And it’s also clear that Governor Romney has so much money that he can grind his way towards the nomination, despite all that.”  So, in other words, elections aren’t really about ideas, character, or who is the best candidate.  They’re mostly about who has enough money to grind his opponents into dust.  Again, that is mostly true, and it’s why we’re in desperate need of campaign finance reform.

But, of course, Newt’s spurious logic is not confined to phone calls to NPR.

Also earlier this week, Robert De Niro–that overrated actor–made a joke about the country not being ready for a white first lady should one of the Republicans win.  Newt demanded that President Obama apologize for De Niro because it was said at a fundraiser for the President.

Bawhaaaaa?!?!

Newt went on to clarify that if the “left want to talk about talk show hosts,” then “everyone in the country ought to hold the President accountable” when someone at his event says something.

Bawhaaaa?!?!?!

So in Newt’s “mind,” a talk show host personally degrading a woman on a national forum is no more culpable than a president whose fundraiser guest said something obviously satirical.  Yup, that almost makes sense…in a Čapek novel!

Of course, Newt doesn’t have the monopoly on spurious logic or even living as if he’s in a fictional world.

Take Rick Santorum.

Apparently the “defender of Homeland Security via denying gays the right to marriage” thinks Obama should not allow his daughter to go to Mexico for spring break because the State Department issued warnings for other parts of the country.  Santorum went on to say that the President “is not above the law.”  Which is a fair point.  Except, of course, that a State Department warning is not a law, especially when the warning doesn’t even apply to the part of the country being visited.

But hey, why let facts get in the way of rhetoric?  I mean, voters do fall for this nonsense all the time.

Sorry for the cynical week-ender, but sometimes our broken political system just proves to be too ludicrous for me to ignore.

I consider myself a reasonable rational person, and I’d like to think I have an open mind about most things.  So when opposing views are discussed, I am quite happy to hear what the “other” side has to say and then, if I have the knowledge and inclination to do so, rebut their claims with evidence, facts, and context as needed.

Apparently, this is not the case for some other folks.  Some people think that presenting their side of an argument is sufficient to “win” the debate (in their mind), regardless of how spurious their logic is, how fallacious their arguments are, and how out-of-context their evidence might be.

Case in point:

I came across this gem on Facebook and, after some very easy fact-checking, it became obvious what a joke the implication of this Post-It note was.  (The premises being: a) gas prices were low when President Obama took office; b) gas prices are high now; therefore 1) it’s President Obama’s fault and 2) you shouldn’t vote for him because of this).

First, some context:

A) Prices of oil and gas were indeed low in January of 2009.  They had risen steadily under George W. Bush (president as of 2001), though they actually declined a bit early on and hit a plateau until 2003.

Remember what happened in 2003?  That’s right, the Bush administration pushed for war in Iraq based on faulty and cherry-picked intelligence, while continuing operations in Afghanistan.  Oil (and thus gas) prices continued to increase until July of 2008, right before the U.S. stock market crash.  (There is ample evidence that speculation on the price of oil led to a bubble of sorts which facilitated and contributed to the crash.  In other words, financiers were looking to continue making money in oil on the backs of thousands of dead soldiers and civilians.)

So the market crashes, sending everything plummeting (including the price of oil), just before the inauguration of President Obama (who inherited the recession which had already started before his term).

B) Since then, as the market has slowly recovered (and oil demand returned), the price of oil (and gas) has gone back up.  Nothing really surprising in that now, is there?

See? Context matters.

As for the conclusions asserted by Mr. Post-It:

1) First, no President has any direct control over the price of gas (which is derived from the price of crude).  Why?  It’s a global commodity, largely influenced by OPEC, a cartel of countries in the region where we’ve been waging war for 10 years.  If they want more money for their oil, they withhold production of crude and the price goes up (they account for about 40% of global oil production and hold most of the world’s proven reserves).

And, as you may know, the U.S. doesn’t even get most of our oil from the Middle East; our top four imports are from Canada, Saudi Arabia, Mexico, and Venezuela.  But, since it’s a global market for oil, we don’t catch a price break from our neighbors because they, like any other smart business entity, are telling us to “show them the money.”  Getting most of our oil from outside OPEC is more of a political move than an economic one.

Certainly the U.S. president (and his administration) can indirectly affect the price of oil (and gas).  For example, say the U.S. unilaterally invades a nation against the better judgment of the international community.  And say that invasion is based on false premises sold to a panicked citizenry.  And let’s say tax cuts are offered as a vapid olive branch to the citizens, which consequently removes more money from the nation’s coffers to actually pay for the war.  Sure, those things could have a detrimental effect on the price of oil (indirectly, through antagonizing OPEC) and directly on our economy.  But I mean, who would do such a thing, right?

Well, I guess this guy would.

2) So are any of the offered premises and conclusions a reason to not vote for President Obama?  I’ll leave that to you to decide, now that you have all the relevant facts and context.  But let me leave you with a few other changes that have occurred since January 2009.

* Gays and lesbians were not allowed to serve openly in the military, but now they can.

* Struggling young people can stay on their parents’ health plans longer until they get on their feet.

* Seniors pay less for prescriptions thanks to the closing of a loophole in Medicare.

* He issued the kill order to end Osama bin Laden’s threat to the United States.

* We are no longer at war in Iraq.

* Workers with student loan debt (which has surpassed credit card debt in this country) can now have that debt forgiven after 10 years of making payments while working in public service, a traditionally low-paying and unselfish career choice.  If you pay for 20 years in any other field, the loan will also be forgiven (because paying 35 years for a four-year education is ridiculous).

I would say that the above “changes” certainly provided “hope” for all those involved (FYI, that’s like millions of citizens).

So for all you folks out there who think you’re presenting your side of the story and hoping for an objective analysis of the situation, that’s not how “objectivity” works.

Objectivity requires full disclosure of the facts in order for them to be, you know, objective.  Leaving out information provides a set of subjective premises and leads to a subjectively biased observation (and consequently skewed conclusion).  Of course, if you don’t actually want to have an open and honest dialogue about something, then presenting only one side of an issue (and doggedly refusing to see the other) will certainly reinforce your own narrow world-view (and certainly, this is much easier to do anonymously or, you know, on the Internet).

But it won’t win you any real victories in a debate.

Devoured by our Consumption

Having wasted countless hours on social networking sites and television, and having once had a pretty crummy diet (apparently), I got to thinking about our consumption habits and our relationship to the products we’re supposedly consuming.

Supersized!

Our health is being devoured by the consequences of our diet.  We eat overly-processed food high in fat, artificial flavoring, and lord-knows-what-else, which has resulted in epidemics of obesity, diabetes, high blood pressure, and high cholesterol.  More health problems also drive up our overall health care costs which, unless you have amazing insurance, eat away at our paychecks or savings to cover deductibles and other non-covered costs.

Our economy is also being destroyed in part by our need for manufactured goods which are no longer made in the USA due to outsourcing (to keep profits from consumerism high).  Without manufacturing to balance our export/import ratio, we fall further behind economically by sending money to other countries but bringing less and less in from foreign buyers.

Financially, we’re chronically in debt.  Credit card debt (to fund our consumerism) devours our livelihood through high interest rates or even the inability to pay at all.  As wealth inequality has grown over the past several decades, we willfully took advantage of easy credit to participate in an illusion of prosperity.

The financial sector’s greed for more and more money led to shakier and seedier investment deals that ultimately threw us into a full-blown recession that ate $12 TRILLION of taxpayers’ wealth.  (Were the financiers held accountable?  Hell no—many were even rewarded!  But I digress…)

Further, student loan debt now exceeds credit card debt in the U.S., debt acquired in pursuit of an education because every employer wants a college degree (despite falling standards in the U.S. educational system).

Information is a bit more complicated in some ways.  There’s a lot more information accessible in the digital age, so much in fact, that we seek ways to limit our exposure to avoid being overwhelmed by it all (or allow our search engines to limit it for us—without our consent).  We still surf the Web and spend many hours on social networking sites, texting, and other virtual activities.  This can come at the expense of actual interaction with real human beings; my favorite example is when I witnessed the family of four out to dinner all clacking away on their portable media devices rather than talking with each other (I wonder if they were texting each other?).  We expect instantaneous communication in all things thanks to the digital revolution, and this erodes our understanding of how real-life interaction works.

Our attention spans seem to have been cut down to a Tweet or less—anything longer seems interminable for some.  It’s even affecting how we watch movies; I was at a documentary recently about the financial collapse, and many in the audience (a more “experienced” generation) thought it was too long (at only 120 minutes to describe a complicated series of events stretching over 30 years).

We seem to want our information summarized (the sound-bite phenomenon), and this seems to come at the expense of our ability to critically analyze those statements for validity, logic, or even rationality.

I’m not some naysayer of consumerism in an absolute sense; I just think we need a dose of moderation in how we go about it and perhaps put some thought into the long-term consequences of our actions.

Now if you’ll excuse me, I have to get back to expanding my virtual manor that I can’t afford in real life.

With the news abuzz with events that, while very tragic for some, quickly pass, continuing events can be easily overshadowed.  Here’s a quick round-up of things that affect most, if not all, of us (on the planet).

The War in Afghanistan– Now the longest war in US history, our involvement in this militarily-infamous region has cost close to half a trillion dollars (more than half the cost of Iraq, which is also still accruing costs), over 1,800 US military fatalities, and over 32,000 wounded soldiers.

The US Debt– Related to the above item, we’re still being embarrassed by a stymied Congress incapable of rising above petty (and irrational) politics to solve our debt crisis.  We, the taxpayers, are being held hostage by ludicrous sound bites and, on the part of many, a willful ignorance of what to do in order to fix this massive problem.  Oh yeah, and the financiers who really facilitated the great bulk of this problem (through the economic recession) have still not been indicted for a single thing.

The European Debt– Another crisis that has, and will continue to have, inevitable impacts on our own economic well-being: more states in the Euro zone are close to failing and looking for a bailout, but where the “saviors” are going to get the money is anyone’s guess.  (To help with all of these woes, a minuscule “Robin Hood” tax on the world’s banking transactions has been proposed; within a day, suspicious “ballot stuffing” against it began pouring in from…wait for it…Goldman Sachs.  Shocker, eh?  Remember, betting on the failure of the US housing market is how they made lots of money—and they’re advising their clients to bet on failure again.)

Chinese/US Military Relations– Defense Secretary Panetta has remarked on the US remaining a “Pacific power” (i.e. relative to China).  Naval power is very expensive to maintain, yet the military and advisors to Congress both advocate that the US needs an increased military presence in Asia.  Not unrelated, this presence is also for competitiveness in economic matters (and in response to China’s military establishment growing bolder).

Let’s keep some things in perspective, eh?

In this second part of my continuing rant on technology, I take a look at the sharing of our personal information, who wants to peek at that data, and how our privacy is compromised.

Panopticon vs. Exhibitionism

I probably don’t need to inform you how various forms of surveillance have been popping up around us for quite some time.  From cameras at intersections to ATMs, from our SSNs to our ISPs, and from satellite imaging to the Patriot Act, the powers that be can readily identify most law-abiding citizens and their actions with regular accuracy (and sometimes even those who are not so law-abiding).  And they’re constantly adding more tools to their arsenal.

Facial recognition technology is evolving quickly, with demands from the military and government to improve the ability to identify subjects in the non-frontal and non-static images usually caught on surveillance cameras.  Plenty of work is being done by scientists to address this issue, and, while it is certainly a worthy cause to nab criminals, terrorists, and other undesirables of the hour, most technology developed for the military/government eventually makes its way into our daily lives.

I recently renewed my license at the DMV, and “for my own protection,” my fingerprints were digitally scanned into the database.  Facial recognition and digital fingerprinting have also made their way into Pizza Hut and KFC to keep a strict eye on their employees—I mean, for security purposes.  I even saw the cashier at my local Wendy’s have to scan her thumb to work the register.

Ostensibly these measures in the workplace are meant to eliminate cheating and slacking in these high-security jobs.  But I think they further dehumanize the workers as a not-so-subtle side effect of this new technology.  This software literally reduces us to our constituent parts in order to identify us (our eyes and fingerprints are unique to us and are both simply physical characteristics).  We truly become simple cogs in the greater machine of corporations when we walk this road.  Much like using the right tool for the job (say, a square peg for a square hole on the assembly line), the tech ensures the right object (the worker) is what it’s supposed to be and in the place it’s supposed to be (the cashier slotted in at the register).

There was this old philosopher, Jeremy Bentham, who envisioned an ideal prison called the Panopticon.  It was basically a circular prison that gave the inmates the impression that they were constantly being watched, even if they weren’t.  The idea is that humans tend to behave if they think the authorities are watching and they could suffer consequences for not abiding by the rules/laws those authorities set forth (everyone slows down when passing a cop on the road).

I’m not saying that the government is building a Panopticon per se, but one could see the constant state of surveillance acting as a de facto means to do just that, albeit under the guise of national/homeland security.  (Take a look at the government-proposed “Data Eye in the Sky” that automatically collects information from the internet, cell phones, and so forth.  According to one excited researcher: “If I have hourly information about your location, with about 93 percent accuracy I can predict where you are going to be an hour or a day later.”).

Of course, it’s not just a blogger’s incarnate Big Brother we ought to be concerned with; we do a great job of voluntarily policing ourselves.  With ubiquitous video/photographic cameras built into our cell/smart phones, YouTube, Flickr, and Facebook are a veritable treasure trove of human behavior in its less savory moments.  These can range from simply embarrassing to evidence of actual criminal conduct, and it’s all posted in the public domain.  We often decry stringent government surveillance (if we know about it), but we often encourage or invite our moments of poor decision-making to be posted on social network sites in an incredible display of willful hypocrisy (or narcissism).

Even if we don’t want our social networking information to be readily available, Facebook’s constantly changing formats, privacy preferences, and terms make it quite difficult for the average user to stay on top of keeping their personal information locked down.

Sure, many of the things we use are voluntary (Facebook, Google, apps for our imachines), but when these things become so commonplace in everyday use, one must come ever closer to emulating the Mennonites to not be involved with the “great wireless grab-bag of info” floating around out there.

So what to do?  I’ve heard a couple of sides to the privacy arguments.  On one hand, if you haven’t done anything wrong, you don’t have anything to hide, so what does it matter if the government/authorities have access to your personal information?  On the other hand, if we continue down this path with our civil liberties and freedoms slowly nibbled away at the edges, we’ll soon be living in 1984.  Mostly, I think that it’s important for people to know what is happening with their personal information, surveillance measures, and care about it (beyond their SSNs and browser history).  In this era where information is power, if we can tear ourselves away from vapid distractions (noise) and focus on the signal, we’ll all be better off.

Next time I want to take a look at cyberwarfare, hackers, and the government’s role in protecting our information infrastructure.

This will be the first of a three-part series on technology—mostly about the current challenges we’re facing with a bit of extrapolation and prediction.

The Power of Technology

Computing power extrapolation

Ever since our ancestors began making tools with stones, we’ve sought to increase our level of technology—and the mantra of the science field has always been “can” we do something, not “should” we pursue such advancements. Since WWII, computing power has increased at an exponential rate (doubling every 18 months or so), and other fields are gaining ground as well. With strides in organic computing and nanotechnology, there seems little impediment to this accelerating process, and what it means for the human race should be closely examined.
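To put that doubling claim in perspective, here’s a minimal sketch of the arithmetic (assuming the commonly cited 18-month doubling period, which is a rough historical average, not a law of nature):

```python
# Rough illustration of exponential growth in computing power,
# assuming a doubling period of ~18 months (1.5 years).
def growth_factor(years, doubling_period=1.5):
    """Return how many times more capable computing becomes after `years`."""
    return 2 ** (years / doubling_period)

print(growth_factor(10))  # a decade: roughly a 100-fold increase
print(growth_factor(30))  # thirty years: about a million-fold (2**20)
```

At that rate, the machines of a few decades out would be millions of times more capable than today’s, which is exactly the kind of compounding the Singularity crowd extrapolates from.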

Let’s take a quick look at what organic computing actually is. Basically, it’s the field of science that is trying to create adaptive and self-organizing computer systems. Yes, we are currently trying to engineer the AI bogeyman of countless “science fiction” stories (the one where machines gain sentience, replicate, and exterminate mankind). One major player in this field is the German Research Foundation, which funds research into adaptivity, re-configurability, the emergence of new properties, and self-organization—a list of characteristics that any cautious person would want to limit in computers/machines.

Back in 2004, Israeli scientists built a computer made of DNA that could eventually be used to diagnose and treat a medical condition by identifying cells of a certain type and then releasing medicine to treat those sick cells automatically. It doesn’t take many more steps along that logical path to see a horrifying version of this computer, weaponized to attack healthy cells of any particular type and eradicate them (say, one’s liver cells or white blood cells). Add in the abilities of these computers to adapt and reconfigure themselves (e.g. spread from one host to another), spontaneously create new properties (e.g. make themselves deadlier), and self-organize (e.g. make themselves more efficient and replicate), and we’re well on our way to engineering an extinction-level plague able to eradicate our species. Fortunately, this computer was the simplest of “machines,” and could barely be called a proper computer by those in the field, so perhaps my fears are decades down the road, right?

Not so much. In 2010, researchers from Japan and Michigan “have succeeded in building a molecular computer that, more than any previous project of its kind, can replicate the inner mechanisms of the human brain, repairing itself and mimicking the massive parallelism that allows our brains to process information like no silicon-based computer can.” In other words, we’ve begun creating computers on a molecular level that can mimic the human brain without the size limitations of our cranium. Imagine what a room-sized computer composed of these molecular components combined with the abilities to self-regenerate–and all of the other characteristics mentioned above–will be able to do.

Nanobot

Or imagine this technology as the functioning brains of nanobots—tiny robots able to swim through our bloodstream and attack cells they’ve been programmed to destroy. Sound like more science fiction? It’s not: last year (2010), researchers successfully deployed these machines in cancer patients. Mark Davis, head of the research team at the California Institute of Technology, recounts his team’s success: “It sneaks in, evades the immune system, delivers the siRNA [small interfering RNA], and the disassembled components exit out.” Yeah, nothing could possibly go wrong with a programmed machine designed to evade our immune system and insert genetic code into cells to disrupt their functioning, right?

Call me an over-reactive doomsayer if you want, but the combination of advances in these fields is scaring the crap out of me (and not just because I read a lot of sci fi).

The Singularity

The birth of AI

Apparently this convergence of computing power, nanotechnology, and genetics is already being studied by people much smarter than me, and they call the moment machines gain sentience (AI) “the Singularity.”  It may sound like science fiction to a lot of people, but if you look at some of the most influential science fiction literature and what has already occurred, you might not be so quick to dismiss the Singularity on such grounds.

If you do take the idea seriously and think about the consequences of this moment, you will come to the conclusion that the human race/era will be irrevocably changed. What exactly those changes will look like is up for debate, but it will be fundamentally different from the current age.

Time had a great article about the Singularity and one of its main researchers, Raymond Kurzweil. The same people involved in the Singularity field are also looking into life extension, especially through the melding of man and machine. They predict that this could occur in various ways, whether it’s uploading our consciousness into a software program or nanobots repairing the age-damage to our bodies. The piece had a sobering section on how man and machine are already becoming one and functioning side-by-side: “Already 30,000 patients with Parkinson’s disease have neural implants. Google is experimenting with computers that can drive cars. There are more than 2,000 robots fighting in Afghanistan alongside the human troops.” And don’t forget, in our most recent competition (one involving syntax and actual understanding), the machine Watson trounced our best knowledge-champions on Jeopardy.

Kurzweil puts his date for the Singularity at a conservative 2045, based on the continuing exponential growth of various technological factors.  That’s well within my lifetime, and certainly within my daughter’s.  Of course, I’m not saying we should all panic due to this gospel-according-to-Kurzweil, but rather that it is worthy of serious thought–and if we do believe this will happen, it deserves appropriate preparation as well.

In the next post I’ll take a look at the other applications of technology (social media and surveillance).

Addendum: 10/19/11: Apparently the government is trying to build an “Eye in the Sky“–a computer system that captures all of the free-flowing data from the internet, cell phones, etc. in order to predict movements of mass-human behavior (political revolutions, economic recessions, pandemics, etc.).  You know, kind of like a big Net in the Sky to capture information about human behavior–what could possibly go wrong?

Addendum: 11/14/11:  Now, the field of science is actually helping robots control human bodies–literally.

Addendum: 3/9/12: Oh, and the navy now has grenade-throwing robots that “fight fires.”  You know, like those pacifist robots in Robopocalypse.

I just finished the book Reviving Ophelia: Saving the Selves of Adolescent Girls by Mary Pipher.  Although a bit dated, it scared the crap out of me as a parent of a 2-year old daughter, and I’ll Pour My Heart Out here for a minute.

Yes, you, too, can look like a celeb--with a crew of professionals and a healthy dose of Photoshop.

Pipher examines the enormous and contradictory tensions that adolescent girls face growing up in American culture (of the 1990s).  It seems to me that most of the trends she enumerates have only gotten more pronounced in the first decade of the new millennium.  Tensions like: be smart, but not so smart that you might threaten others (especially boys).  Be nice and fit in, at the expense of your individuality.  Meanwhile, girls are bombarded with ridiculous images of the unobtainable “ideal” woman by marketers and the media.  Pipher argues that all of these conflicting messages undermine girls’ self-confidence and their attempts to discover their “true” selves at the very time they should be striving to do just that.  Worse yet, just when they could use help navigating the stormy seas of media and peer saturation, they’re told to pull away from their parents to become true, independent adults.  And apparently it doesn’t matter much what type of parenting is involved (low-to-high affection and low-to-high control), though girls with parents high on both seem to do the “best.”

I found myself not only saddened by the struggles my daughter will inevitably go through, but even angrier at our culture than I normally am.  Why is it that even in the social sciences, there seems to be a sense that girls should adapt to the patriarchal culture?  Where are the books and theories that advocate bringing up boys so they don’t turn out to be such misogynistic d-bags? Where’s the talk about just raising your kids to treat others as humans trying to make it in the same tough world?  If it’s a realignment of our cultural values that’s needed, I say we better get on with it!

My one gripe about the book was Pipher’s comparisons of the pressures girls face to those that boys must deal with.  First she admits that she hasn’t worked with boys, so she won’t comment on their challenges, but then she uses boys as the standard against which she measures the double standards that girls are put through.  And that part, had she left it at that, would have been fine.  But Pipher seems to imply that girls’ feelings and psyches are much more complicated than boys’, as are the challenges they face.  I’d argue that this is not only a misconception, but that such a misconception is why we have trouble raising self-aware and emotionally-intelligent boys in the first place.  There are plenty of challenges and double standards that boys face, and they can be just as insidious as the ones girls have to overcome.  Indeed, if we acknowledged boys’ emotional complexity (and, more importantly, if we encouraged them to deal with those emotions rather than suppress the feelings that weren’t “manly”), I think society would be better off.  Instead, some of us don’t get around to that until after college (if at all).  That’s a bit late if you ask me.

In any case, I hope that we can raise our daughter to take any bullying or name-calling with a grain of salt, and hope that we can preserve our close relationship.  Of course I realize this is exactly most parents’ hopes that are often dashed upon the rocky shores of adolescence, but that doesn’t mean I have to give in to those pesky societal pressures!

This should not be our daughters' role model.

And this is not who our males should be emulating.

So Facebook is changing…again [wait for gasps and astonishment to subside].

I’m not against change as a rule, but I am annoyed with constant change in a user interface (how you click around a site) without informed user studies (asking us what we want to change or keep the same).  But all that aside, I have bigger concerns about Facebook’s new “Timeline” and developer application changes.

First, the Timeline is a “new way to express yourself,” according to Zuckerberg (that young fella who invented Facebook, more or less).  It’s kind of like Facebook meets Twitter, with pictures.  Or, as I like to call it, another avenue to feed our narcissism, but I digress.  I hope this is a voluntary option as opposed to an “opt-out” program (like so many of Facebook’s changes).  Indeed, with the last change Facebook made (“top stories”), the algorithms decide for us what counts as a top story and put those at the top of our news feeds.  It’s kind of like Google’s search algorithms, which filter our results based on a variety of demographic factors.  In other words, the engineers behind these heavily used platforms are writing programs that limit what we see without our consent.  That’s called censorship, people, and it’s being done right under our noses.

Further, the Timeline goes back as far as you’ve been on Facebook (presumably), and puts all of your actions/updates/etc. in one place.  You know, so someone can look and discover a whole lot about you in one glance instead of taking the usual, longer methods of stalking, um, I mean surveillance…wait, I mean perfectly innocent exploration about their “friend.” Sure, it’s up to us (apparently) to edit this stream of visual faux pas, but we all act responsibly when it comes to posting our lives on the internet, right?

Second, Zuckerberg announced that you can use applications within Facebook to virtually join your friends to listen to music or watch shows.  You know, in case actually getting together for a real social event is just too much of a bother.  Plus, all those new developers and companies can now more easily download all that info about you to bombard you with more “targeted” ads, because that’s just what we all need—more opportunities to rebuff those marketers attempting to convince us to buy products we don’t actually need.  I wonder how much money the marketers paid Zuckerberg for access to his audience of 800+ million users.

Maybe it is time for me and like-minded people to move over to Google+.  I’m sure we have a few years before massive exploitation on that site…

On this 10th anniversary of Al-Qaeda’s successful attack on New York and the Pentagon, there are many heartfelt and important testimonials, tributes, and memorials being offered by various groups and people.  Most of these involve remembering the victims, profiling the survivors, and exploring the meaning of “9/11,” as the event has come to be known in American culture.  I would like to point to the historical roots of the attack as a learning opportunity for some of the events occurring in the Middle East.

Back in the late 1970s and through most of the 1980s, there was a war in Afghanistan between the Soviet Union (backing a newly formed Afghan government) and the insurgent mujahideen (freedom fighters against the godless Communists).  This was, of course, during the Cold War, so the US got involved, sending forces to train the mujahideen on Pakistani soil with money from the Saudis (like the bin Ladens).  Osama learned guerrilla tactics during such training sessions.

Also during this time, the Iranian Revolution replaced the pro-American Shah with the previously exiled Ayatollah Khomeini.  Again, the US decided to get involved: it encouraged Saddam Hussein to invade Iran, and Reagan sent Donald Rumsfeld as emissary to the Iraqi leader, eventually providing arms and other support (including biological weapons).

Both of these wars drained the combatants significantly in monetary and human resources.  When the wars finally ended (in 1989 and 1988, respectively), nothing had really changed in Afghanistan or Iran, other than large-scale destruction.  The US disengaged once the fighting stopped, and, especially in Afghanistan, the mujahideen were left to fend for themselves.  The “freedom fighters” were not well received in their home countries and eventually evolved into bands of stateless warriors, hardened both physically and spiritually by their battles.

In 1990, bankrupt and under pressure from his creditors, Saddam invaded Kuwait under various pretenses.  Again, the US interceded, this time alongside a coalition with outright military force (Secretary of Defense Dick Cheney made several trips to meet with the Saudi king during the buildup of support for military action).  As the US-led forces of Operation Desert Shield were based in Saudi Arabia, the mujahideen were outraged that an Islamic government would let infidels use the holy land to wage war against fellow Muslims.  Toward the end of the war, the US encouraged an Iraqi-led uprising, and Kurdish forces in the north began to fight, hoping to trigger a coup d’état.  When US support failed to materialize, Saddam’s troops crushed the rebellion and drove many of the Kurds into Turkey and Iran.  A month later, US troops began to withdraw, having failed to spark a successful coup and unwilling to pay the political and human costs of toppling Saddam.

Over the next 10 years, Osama bin Laden probed US intelligence capabilities and reactions with a variety of attacks—on the embassies in Kenya and Tanzania, and on the USS Cole.  He effectively mapped our strengths and weaknesses (with help from his sources in the Pakistani and Saudi Arabian intelligence services), overloaded our intelligence system, and shored up the security of his own organization (by checking Al Qaeda for leaks), all of which ultimately led to the successful attacks of September 11th, 2001.

These attacks in turn led the US to invade Afghanistan looking for Osama bin Laden.  Shortly after, we also invaded Iraq under certain pretenses, eventually toppling Saddam.  Both of these wars have cost us billions of dollars, tens of thousands of casualties (that’s wounded and killed), and facilitated our current recession.

So why the history lesson?  Because several things going on now ought to be looked at critically with a historical eye.  There has been much talk about the “Arab Spring,” a series of demonstrations and revolutions across the Arab world.  Only one revolution (Tunisia’s) has actually produced a change of regime led and completed solely by its own people.  In Egypt, although President Mubarak stepped down, he ceded all power to the military, which is still in the “process” of instituting reforms alongside the new prime minister, Essam Sharaf.  In Libya (where NATO forces provided direct military aid against Gaddafi’s troops), the terrorist/not-a-terrorist/illegitimate leader’s (1970s-2003/2004-2010/2011) regime has just collapsed, and now the real test of change will begin.

Protests in Syria are still being met with brutal crackdowns.  Yemeni and other protests are still ongoing, while other countries (Algeria, Jordan, Morocco, Oman, Saudi Arabia, Kuwait, Lebanon, Sudan, etc.) have seen their protests fizzle out with no change.

The point, it seems to me, is that the US (and the West in general) should tread carefully in these volatile situations.  We should not be so short-sighted as to think that our actions (or lack thereof) won’t have consequences that can manifest themselves over the next 20 to 30 years.

Otherwise, the pain we’ve endured and the lessons we should have learned from 9/11 will have been in vain—and that would be another tragedy itself.

Note: Most of this information comes from George Friedman’s book America’s Secret War (2004).

Sometime last week (my sense of the time-space continuum is all FUBAR due to teething-induced sleep deprivation), I came across a great article written in 1994 about cyberspace.  It’s a bit long and potentially dense, but I highly recommend it.  The author (humdog) accurately observed how humans and corporations interact in the “electronic community,” though she could not foresee how far we would take those interactions down the rabbit hole.  I’d like to address some of her points below and compare them to what is happening today.

“Cyberspace…is a black hole; it absorbs energy and personality and then re-presents it as spectacle.”

Anyone with a Facebook account can attest to this; in 2010, it became the most visited URL in the world.  Heck, anything (“social” networking) that surpasses porn for internet usage ought to bear closer scrutiny.  Although there are some FB users out there who share way too much, most of us present a very specific persona to the rest of the online community, something we do indeed spend a lot of energy cultivating.  Yet this “spectacle” that we present is subject to ridicule, bullying, and even short-lived fame.  I wonder where we would be if we spent more time developing our actual interpersonal relationships.

“we prefer simulation (simulacra) to reality.  image and simulacra exert tremendous power upon our culture.  almost every discussion in cyberspace, about cyberspace, boils down to some sort of debate about Truth-In-Packaging.”

Again, the facades we create for our social networking sites are, in many cases, our “preferred” (dare I say “idealized”) versions of ourselves.  Whether it’s for job-seeking on LinkedIn, mate-seeking on eharmony, or our alter ego in Second Life, we have taken humdog’s idea about simulacra and multiplied it tenfold.  I don’t know if this is some mass-psychological epidemic of multiple-personality disorder or merely a desperate desire to live our lives as someone else, stemming from dissatisfaction with our daily lives.  Or maybe it’s none of that.  But somewhere along the line, we could very well lose sight of our true selves, or at least do things as our fake personas that we wouldn’t normally do (hopefully).

This is probably some guy named Otis living in his parents' basement.

Of course, with the explosion of the Internet, savvy users are always on the lookout for scams, phishing attempts, and other assorted false sirens meant to lure the unsuspecting.  From ads to photos, one of our first questions is: “Is that Photoshopped?”  In other words, the process of fakery has been turned into a verb named for its most popular tool.  And apparently it only really bothers people when it’s used to sell beauty products that might give folks a false sense of what results to expect—and even then, only if Photoshop has been used “too much.”  Our sense of reality is being distorted gradually and insidiously, and it does manifest itself in the “real” world.  As science pushes the frontiers of AI and robotics, I suppose it’s only a matter of time before mankind falls to some sort of robo-pocalypse previously relegated to the thought exercises of science fiction writers.

“i have seen many people spill their guts on-line, and i did so myself until, at last, i began to see that i had commodified myself…i created my interior thoughts as a means of production for the corporation that owned the board i was posting to, and that commodity was being sold to other commodity/consumer entities as entertainment.  increasingly, consumption is micro-managed…”

Quietly but inevitably, simple chat conversations or searches are turned into a means of highly specific advertising aimed at the user.  Whether it’s on Google or Facebook, we’re providing these companies with a way to efficiently streamline their advertising dollars by painting a big old bull’s-eye on our virtual foreheads.  Why should they spend millions on scattershot advertising when we’ve lined up on the shooting range like little ducks moving in a row?  Worse in some ways is how search engines’ algorithms filter results for us based on what they “think” we want to see (judging from past searches and demographic factors).  In other words, they’re taking control out of the users’ hands and effectively censoring what we see.
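The filtering mechanism humdog anticipated can be sketched in a few lines.  This toy Python example is my own illustration (not any platform’s actual code, and real ranking systems are vastly more complex): the same pool of content produces a different feed for each user, depending on the interests the platform has inferred about them.

```python
def personalized_rank(items, profile):
    """Order content by overlap between each item's topics and the
    user's inferred interests -- 'relevant' items rise to the top,
    everything else sinks out of sight."""
    def score(item):
        return len(set(item["topics"]) & set(profile["interests"]))
    return sorted(items, key=score, reverse=True)

# One shared pool of content...
items = [
    {"title": "Local election coverage", "topics": ["politics"]},
    {"title": "New phone review",        "topics": ["gadgets", "ads"]},
    {"title": "Friend's vacation album", "topics": ["friends", "photos"]},
]

# ...but two users with different inferred profiles see different feeds.
gadget_fan = {"interests": ["gadgets", "ads"]}
news_hound = {"interests": ["politics"]}

print([i["title"] for i in personalized_rank(items, gadget_fan)])
print([i["title"] for i in personalized_rank(items, news_hound)])
```

Neither user ever sees the ranking criteria or consents to them, which is exactly the quiet editorial control described above.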

Beyond the ordinary conversations, it’s truly astounding what people will say (or exhibit themselves doing) online.  Perhaps the most outlandish things are often posted anonymously (which points to the trend of a lack of accountability for what we spew online).  Yet, I’ve seen embarrassing, vulgar, and hideous things posted under social networking accounts (assuming that the profile is real, which, sadly, I personally know to be the case in some instances).  Such outbursts provide their viewers with fodder for entertainment, and may even have been posted to produce some sort of shock-and-awe campaign of narcissistic warfare.  To me, it shows a lack of dignity and self-respect (or a pathological need for attention).

“the rhetoric in cyberspace is liberation-speak.  the reality is that cyberspace is an increasingly efficient tool of surveillance with which people have a voluntary relationship.”

Yeah, no one forces us to post the intimate details of our lives, yet we often do.  And we certainly know the privacy and security problems these sites have.  Yet we continue to share our personal information for some irrational reason (yes, I’m including myself).  Further, we no longer have to watch out only for Big Brother, but also for Little Brother, since nearly everyone has a camera on their cell phone to capture anything going on in the street and post it via YouTube for the world to see.  We willingly sell out our fellow man for that spectacle of entertainment (though it is sometimes warranted, in cases of criminal conduct or, ironically, keeping an eye on Big Brother).  But Big Brother is still out there—government agencies are constantly trying to gain access to the terabytes of personal information that companies hold about their users.  Heck, iPhone users are not only tracked with GPS, but their photos are taken without their consent (and presumably stored somewhere).  I’m not saying this is a nefarious plot to track and record all of Apple’s users, but…

“so-called electronic communities encourage participation in fragmented, mostly silent, microgroups who are primarily engaged in dialogues of self-congratulation.  in other words, most people lurk; and the ones who post, are pleased with themselves.”

Yeah, I’m a blogger, so I fall into that latter category—I just hope I’m raising some awareness along the way!  But more broadly, the problem with this information age is the self-filtering most of us do by going to the sites/groups/listservs that reinforce our views.  Whether it’s AlterNet, FOX News, or some conspiratorial cabal site, we don’t regularly seek out the “other’s” views.  And all too often, our sites are about pointing out what’s wrong with the other side and why our ideology is the right one.  This doesn’t pave the way to the reasonable, rational dialogue we so sorely need (current debt crisis, anybody?).  These tendencies are exacerbated by other media outlets, but that’s a post for another time.

To wrap up a fairly long post (if you’ve made it this far, thanks!): I think we need to remain vigilant, constantly analyzing how our culture deals with technology, the division between reality and the digital (simulated) world, and how those tools are being used by various parties.  It’s not a job for a few watchdog groups; it is our responsibility as a collective (lest we be assimilated!).  I’d rather not prove any of the dystopian authors correct if we can help it.