From the outside in

Monday, October 31, 2011

Halloween, The Wiccan New Year: Celebrate Samhain

via The Daily Dish by Andrew Sullivan on 10/31/11


Starhawk, a practitioner of Wicca, explains the day's significance:

For Witches, for those who practice the renewal of the ancient, pre-Christian Goddess religions of Europe and the Middle East, Halloween is our most sacred holiday, our New Year. In Celtic Ireland, Wales and Scotland, Samhain, pronounced ‘sau-in’, was the time when the sheep and cattle were brought down from the summer fields, when the harvest was gathered in and the dark time of year began. The fruits of the harvest, the blessing of the year’s abundance, was shared with the ancestors in the form of offerings which have come down to us in modern times as the candy we give to children-who are the ancestors returning. Harvest is a time of ending, but also a time of beginning, for the Goddess stands for the great regenerative powers of nature. Out of darkness, light will be born anew. Out of the time of cold and dormancy, new life will return. Death is part of a cycle that brings about rebirth.

(Photo: Names of donors are carved into bricks at the Witch School on October 25, 2006, in Hoopeston, Illinois. Wicca is a neo-Pagan religion which uses magic and nature in its teachings. The school, which opened in 2003, offers courses in Wicca theology and hosts seminars and Wiccan rituals on campus. By Scott Olson/Getty Images.)

Posted via email from The New Word Order

‘Unsettled’ And ‘Controversial’ Climate Science, Revealed

via ThinkProgress by Brad Johnson on 10/31/11

Amid the constant din of conspiracy theories and outright lies about global warming pumped by the fossil-funded conservative media, an excellent story by E&E News reporter Paul Voosen explores the real controversies in climate science today. Profiling scientists at work from Mauna Loa to MIT, Voosen investigates scientists’ work to refine our knowledge of how the complex climate system is responding to the massive influx of man-made pollution. The decadal trend of accelerating global warming is well understood, but interannual variability and the exact rate of change are matters of great debate, as oceanic circulation, solar variation, soot and sulfides, and other phenomena interact in ways that scientists are working furiously to measure with greater precision.

Posted via email from The New Word Order

5 Reasons Google+ Could Win the Social Enterprise Battle

via Mashable! by Balakrishna Narasimhan on 10/31/11


Balakrishna Narasimhan leads solution marketing for Appirio, a cloud solution provider that helps enterprises adopt, connect and extend cloud platforms such as salesforce.com, Google and Workday. Follow him on Twitter: @bnara75

Last Thursday, Google announced that Google+ will be available for Google Apps users. This means that the millions of people using Google Apps for their businesses will now have access to the Google+ social collaboration platform.

With its unique features for search, selective sharing and rich communication, Google+ offers consumers a very different user experience than the established social networks. For individuals, Google+ has quickly become a great place to build your interest graph — that is, find the latest content and people related to topics you’re interested in.

SEE ALSO: Google+: The Complete Guide

With its seamless integration with Google Apps, Google+ promises a very different type of social enterprise experience. In fact, Google+ has five unique advantages over other social business platforms.


1. Smart Integration With Existing Google Apps


Google+ is fully integrated with Google Apps. As a user, you don’t need a new login — it’s just another tab like mail, calendar, docs or video. Most business users spend their day in mail or calendar, so a tool that’s easily accessible from the daily workflow has advantages over third-party software.

Thinking a bit ahead of where the product is today, the integration with Google Apps opens up some pretty exciting possibilities. You can imagine “+1” buttons and rich collaboration across sites, docs, spreadsheets, presentations, blogs, videos, photos and more. Or imagine working within a doc and starting a hangout with collaborators while sharing your screen. For companies using Google Apps, taking advantage of these features would require no additional software, logins or changes in behavior.


2. Google+ Already Knows a Lot About You


Because of its tight integration with Google Apps, Google+ could take advantage of what it already knows about each business user, including whom they email, how often and how recently, as well as the topics they write about and search for. Google+ is in a position to help an enterprise user not only quickly build out his internal circles, but also discover those outside the company who are talking about the same topics or industry. If Google chooses to pursue this, it would make a great tool to help each user build out broad interest-based professional networks.


3. Google+ Is Uniquely Positioned to Help You Find and Share Interesting Content


Nobody has a better index of what’s on the web than Google. So nobody is better positioned to help you find interesting content and people from both inside and outside your company. Google+ Sparks lets you follow the latest from the web on topics you’re interested in, and you can imagine something similar within your domain. Internal Sparks could let you quickly find content and experts within your company on work-related topics you’re most interested in.


4. Google+ Integrates Public and Private Sharing


Unlike other social enterprise platforms, which keep most shared content behind company walls, Google+ integrates public and private sharing. When I’m using Google+, I can decide for each post whether I want to share it with my colleagues, my clients, or certain subsets of either category. Also, because a number of websites have already embedded +1 buttons, it’s easy to “like” content from across the web and share it with targeted groups.


5. Android Phones Sync Easily With the Entire Apps Suite


Finally, an Android mobile phone brings this complete integration to users on the go. Activating Android handsets with your company’s Google Apps account brings all this productive and social functionality to the palm of your employees’ hands. And the wide variety of devices and carriers means greater flexibility.


A Video Explanation of Google+



The Google+ project: A quick look


Google provides an overview of the entire Google+ project.




Posted via email from The New Word Order

Beagle Survives Gas Chamber, Seeks Loving Home #WIN #Hero

via Gawker by Seth Abramovitch on 10/30/11

On October 4th, a stray Beagle mix in an overcrowded animal shelter in Florence, Ala., was loaded into a gas chamber with 18 other dogs marked for death. Cody Berry, the 21-year-old worker assigned this awful task, locked the door, turned a key and pressed a button that released carbon monoxide into the chamber. He returned later, The Star-Ledger reports, and heard something moving inside the chamber.

Posted via email from The New Word Order

Sunday, October 30, 2011

Koch-Fueled Study Finds Recent Warming “On the High End” and Speeding Up, as...

via ThinkProgress by Joe Romm on 10/30/11

We have learned two important things from the Berkeley Earth Surface Temperature Study (BEST):

  1. Denier claims that prior scientific analyses of the key land surface temperature data OVER-estimated the warming trend were not merely wrong; the reverse was true. Warming has been high and accelerating.
  2. The Deniers and Confusionists and their media allies can never be convinced by the facts and will twist themselves into pretzels to keep spreading disinformation.

We also learned that BEST’s Judith Curry still would rather be a confusionist than a scientist — but that ain’t news (see “Judith Curry abandons science”). I digress.


(Figure: The decadal land-surface average temperature, using a 10-year moving average of surface temperatures over land. Anomalies are relative to the Jan 1950 – December 1979 mean. The grey band indicates the 95% statistical and spatial uncertainty interval.)
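As a minimal sketch of what that caption describes (not the BEST pipeline itself, and using synthetic monthly data rather than the real record), anomalies are taken relative to the Jan 1950 – December 1979 mean and then smoothed with a 120-month moving average:

```python
import numpy as np

# Synthetic monthly land-temperature series, 1850-2011 (illustrative only,
# not the BEST dataset): a slow warming trend plus noise.
rng = np.random.default_rng(0)
months = np.arange(1850, 2012, 1 / 12)
temps = 8.5 + 0.005 * (months - 1850) + rng.normal(0, 0.3, months.size)

# Anomalies relative to the Jan 1950 - Dec 1979 mean, as in the caption.
baseline = temps[(months >= 1950) & (months < 1980)].mean()
anomalies = temps - baseline

# 10-year (120-month) moving average, as plotted in the figure.
window = 120
smoothed = np.convolve(anomalies, np.ones(window) / window, mode="valid")
print(round(smoothed[-1], 2))  # most recent decadal-average anomaly
```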

Recall the foundation of the phoney Climategate charge: somehow the climate scientists at the Climatic Research Unit (CRU) at the University of East Anglia, led by Phil Jones, were manipulating the data and the peer review process as part of a grand conspiracy to convince the public the earth has been warming faster than it really is. A key point is that “the CRU compiles the land component of the record and the Hadley Centre provides the marine component.”

The BEST team vindicated climate science — see Koch-Funded Berkeley Temperature Study Does “Confirm the Reality of Global Warming.” Equally important, if you read the key paper, they found:

we find that the global land mean temperature has increased by 0.911 ± 0.042 C since the 1950s….  our analysis suggests a degree of global land-surface warming during the anthropogenic era that is consistent with prior work (e.g. NOAA) but on the high end of the existing range of reconstruction.

D’oh!  The BEST data shows considerably higher warming in recent years than HadCRU (the red line above).

Of course, this isn’t news to anybody who actually follows this issue.  Two years ago, the Met Office released an analysis concluding that “The global temperature rise calculated by the Met Office’s HadCRUT record is at the lower end of likely warming.”

As an aside, Muller, in a March 2010 talk (near the end), clearly states that if warming is on the high range, then humanity should be more concerned because we have “less time to react.”

What’s even more worrisome is that the study clearly shows that the warming trend is accelerating.  First, “Our analysis technique suggests that temperatures during the 19th century were approximately constant (trend 0.20 ± 0.25 C/century).”  No big surprise there.

But then as human emissions kick into overdrive, things heat up:

The trend line for the 20th century is calculated to be 0.733 ± 0.096 C/century, well below the 2.76 ± 0.16 C/century rate of global land-surface warming that we observe during the interval Jan 1970 to Aug 2011.

That is, in the past 40 years, the land has warmed nearly 4 times faster than it did over the 20th century as a whole. This really kills the denier meme that the observed data suggests only a small amount of warming this century. In fact, even the warming of the past 4 decades was reduced by human and volcanic aerosol emissions and the general lag between emissions and warming. Thus, it is now patently obvious that if we stay on our current emissions path, the acceleration of warming will continue as greenhouse gas concentrations continue rising. That’s without even considering the amplifying carbon-cycle feedbacks.
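The "nearly 4 times" figure is just the ratio of the two quoted trend rates. A quick check, with a simple first-order error propagation added for illustration (the propagation is my own back-of-the-envelope addition, not a number from the paper):

```python
# Trend rates quoted from the BEST paper, in deg C per century.
rate_20th, err_20th = 0.733, 0.096     # 20th century as a whole
rate_recent, err_recent = 2.76, 0.16   # Jan 1970 - Aug 2011

ratio = rate_recent / rate_20th
# First-order propagation of the quoted uncertainties onto the ratio.
ratio_err = ratio * ((err_recent / rate_recent) ** 2 +
                     (err_20th / rate_20th) ** 2) ** 0.5
print(f"{ratio:.1f} +/- {ratio_err:.1f}")  # roughly 3.8 +/- 0.5
```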

Another mini-bombshell in the paper, which has led co-author Curry to frag team leader Muller, is this conclusion:

Though it is sometimes argued that global warming has abated since the 1998 El Nino event (e.g. Easterling and Wehner 2009, Meehl et al. 2011), we find no evidence of this in the GHCN land data.  Applying our analysis over the interval 1998 to 2010, we find the land temperature trend to be  2.84 ± 0.73 C / century, consistent with prior decades.

Now even though Curry signed her name to this submitted journal article, she apparently doesn’t believe it’s true.

The pseudo-journalist David Rose of the UK’s Daily Mail got a bunch of quotes from her in a piece headlined, “Scientist who said climate change sceptics had been proved wrong [aka Muller] accused of hiding truth by colleague” [aka Curry]. It is exceedingly difficult to know what Curry is saying because

  1. It is always difficult to know what Curry is saying (see Hockey Stick fight at the RC Corral).
  2. Rose generally isn’t reliable (see “David Rose destroys his credibility and the Daily Mail’s with error-riddled climate science reporting” and links therein).
  3. Curry has already walked back some of her comments (see her post here, but put a head vise on first, please).

But, she does say on her blog, “In David Rose’s article, the direct quotes attributed to me are correct.”

Still, neither she nor Rose appears to know what they are talking about. Nor does Curry appear to have read the paper she put her name on.

Tamino has sorted out the statistics in his post, “Judith Curry Opens Mouth, Inserts Foot.”  He notes at the end:

Judith Curry protests that she was misrepresented by the article in the Daily Mail, and several readers have mentioned that David Rose, the author of the article, is just the man to do such a thing. It’s easy to believe that she was indeed the victim of his malfeasance.

But even after reading this post, she still hasn’t disavowed the statement “There is no scientific basis for saying that warming hasn’t stopped.” In fact she commented on her own blog saying, “There has been a lag/slowdown/whatever you want to call it in the rate of temperature increase since 1998.” Question for Curry: What’s your scientific basis for this claim?

In his post, Tamino shows there is no scientific basis for Curry’s claim at all:

Judith Curry’s statement is exactly the kind of ill-thought-out or not-at-all-thought-out rambling which is an embarrassment to her, and an embarrassment to science itself. To spew this kind of absolute nonsense is shameful. Judith Curry, you should be ashamed of yourself.

If I may offer an imperfect analogy, suppose your kid averages 70 in his ten math tests in 7th grade, then averages 80 in ten tests in 8th grade, and then averages 90 in ten tests in 9th grade. Is your kid getting better in math? What if your kid got the exact same yearly averages but had one 100 toward the end of 8th grade and one 100 toward the end of 9th grade? Does that suddenly mean your kid didn’t get better in math in 9th grade?

The deniers and confusionists would have you believe so.  In fact, Tamino shows that the warming trend is real in the Berkeley data even if you start the trendline fairly recently:

(Chart: Warming rate vs. start year in the Berkeley land data, from Tamino’s post.)

That shows just how mistaken, how foolish, how downright boneheaded it is to say that “There is no scientific basis for saying that warming hasn’t stopped.”
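For a rough sketch of the kind of calculation behind that chart (not Tamino's actual code, and using synthetic stand-in data rather than the Berkeley series), you fit an ordinary least-squares trend for each candidate start year and look at the slope:

```python
import numpy as np

# Synthetic annual land anomalies, 1975-2010: a steady warming trend plus
# noise (a stand-in for the Berkeley series, which is not reproduced here).
rng = np.random.default_rng(1)
years = np.arange(1975, 2011)
anoms = 0.028 * (years - 1975) + rng.normal(0, 0.1, years.size)

# Fit an ordinary least-squares trend for each candidate start year and
# report the warming rate in deg C per century.
for start in range(1975, 2006):
    mask = years >= start
    slope = np.polyfit(years[mask], anoms[mask], 1)[0]
    print(start, round(100 * slope, 2))
```

With a genuine underlying trend, the fitted slope stays positive for every start year; it simply gets noisier as the window shortens, which is the point the chart makes.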

Curry tried to frag Muller, but dropped the grenade on herself.


Posted via email from The New Word Order

Mapping The Human Era

via The Daily Dish by Andrew Sullivan on 10/30/11

Some scientists and scholars are calling today's era the Anthropocene, a unique geological age in which human activity, not natural processes, is the principal driver of planetary change. Nate Berg interviewed Felix D. Pharand about his map exemplifying the Anthropocene, featuring population centers, transportation routes and energy transmission lines:

The biosphere is made out of living matter. ... It is a world where humans appeared only recently. Now, indeed, our species and its 7 billion people is still growing inside it, converting ever more wilderness areas into human-influenced landscapes. This world is however finite, unique and fragile. Now is a good time to start thinking of it this way. I believe we are still, in our heads, living in a pre-Copernican world. It’s time to upgrade our worldview.

(Video: Anthropocene Mapping from Globaïa on Vimeo.)

Posted via email from The New Word Order

Thursday, October 27, 2011

Steve Jobs Called Fox News a 'Destructive Force in Our Society' [Valleywag]

via Gawker by Ryan Tate on 10/25/11

Rupert Murdoch must have imagined Steve Jobs would be a feisty dinner guest. Even so, the News Corp. chairman couldn't have foreseen that, in one night at the mogul's Carmel, California ranch, Jobs would call his tech people incompetent, get a guy fired, and say that Fox News was literally destroying the world.

Posted via email from The New Word Order

Wednesday, October 26, 2011

Lori Day: Restoring Sanity in the Blogosphere: Discovering Godwin's Law

via Technology on HuffingtonPost.com by Lori Day on 10/25/11

I have never been so excited about writing a blog post as I am right now. You know that feeling you have when you find out that something incredibly bizarre and horribly annoying that you've noticed and discussed with people for years is not in your head, and actually has a name, and even its own Wiki?! A total sanity-validator. More in a moment...

Yesterday I was reading Women Making Slow, Sure Strides in Science, Math right here on the Huffington Post, and I was aghast at the comments, a large number of which demonized women for this small success, insisted that female achievement takes something away from men, and revealed many incorrect beliefs and misunderstandings of the facts. The usual gender war had broken out early on in the thread, and the attacks were customarily vicious. Even though there was nothing surprising about this, as I see it every day, I could not help swallowing my usual bitter dose of disillusionment.

I started composing a comment. This was going to be the magnum opus of all flamewar-ending comments. This was going to set everyone straight on the facts. I was going to detail my experiences on the undergraduate admissions committee at MIT, my training as an educational psychologist, and provide links to some of the articles I have written about the gender skewing of college admissions, the history of female underrepresentation in STEM careers, and Why Boys Are Failing in an Educational System Stacked Against Them (written as an advocacy piece for boys right here on HuffPost).

Typing away, I quipped to my husband, "Why do I waste my time? Someone will just call me a feminazi." And to my great surprise, his reply was, "Yup. Godwin's Law."

Godwin's Law? What was that? Well, fellow draft dodgers of the eternal flame wars, allow me to tell you. Back in 1990, at the very dawn of the Internet Age, Mike Godwin, an attorney who was one of the early cyber ethicists, observed the following: "As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one."

In other words, "Given enough time, in any online discussion--regardless of topic or scope--someone inevitably criticizes some point made in the discussion by comparing it to beliefs held by Hitler and the Nazis." And once this gun is unholstered, the thread is finished and whoever shot out the Nazi comment has not only lost his or her own credibility, but has ruined the discussion thread for everyone else because the piling on has begun. Once a thread has devolved into this kind of rhetoric, there is no saving the original topic.

Godwin created his Law essentially as a counter-meme. As a frequent contributor to UseNet back in the early days, he was concerned about the casual, hyperbolic, and frequent references to Nazis being not only a distraction and diversion in comment threads, but being actually disrespectful to victims of the Holocaust by trivializing that horror. His idea was to try to cancel out the Nazi meme with one of his own. Godwin's Law became a wildly popular citation within comment threads, and, like all good neologisms, quickly morphed into a verb. One could now say, when Nazi-shaming trolls had hijacked his or her article or comment on an article, "I've been Godwinned." Or, the people insisting on their inalienable rights of free speech regarding anything related to the Third Reich could say, upon push-back to their Hitler comparisons, "Don't Godwin me." When someone invoked the Law to try to settle down the thread, all bets were off as to whether things would calm down or heat up, but they usually became volcanic. Not much has changed.

One of the funnier offshoots of Godwin's Law is Bright's Law, created by some guy named Peter Bright: "If you cannot work out whether someone is trolling or merely stupid, the answer is probably both."

As a prolific reader and writer of Internet blogs, I find that hardly a day goes by without someone stem-winding someone else by calling them a Nazi. These people have no inkling of the depth of their own embarrassment and shame. That seems to be evidence that Godwin's Law has not "worked" as a counter-meme, but how could it? There is just way too much satisfaction people get from insinuating genocidal mania in other people to bolster their own views. Whether the blog topic is related to gender, politics, or recipes, sooner or later, someone will torpedo the thread with their anger management problems.

When it comes to politics, I notice several prevalent newer memes have popped up amidst the Balkanization of punditry and sound bite wisdom. There is the whole Osama bin Laden/Muslim/terrorist comment bomb that can be dropped without provocation. Then there is the whole tea bagger/neocon meme so popular in today's political discourse. But the point is, Godwin's Law explains all of it!

From now on, I plan to maintain greater composure whenever someone trollishly destroys a thread I'm reading or participating in, or attempts to sabotage an article I have written because -- and here's the key take-away -- Godwin's Law predicts this extremely aggravating phenomenon, allowing me to remain calmer because I anticipate and understand it.

And this is one of those gifts that keeps on giving, so I'm giving it to all of you. March forth into the blogosphere armed with this knowledge, and you, too, can keep your head from exploding every time you think that people cannot get any stupider. They can, they will, and it's not you, it's them.

Maybe that could be Day's Law.

You can connect with Lori on Facebook, Twitter, or Google+.

Posted via email from The New Word Order

Saturday, October 22, 2011

Echolocation event recorded in Lake Champlain

via Skeptic.com by Sharon Hill on 10/22/11

A paper in the pre-press stages presents evidence of an acoustic anomaly in Lake Champlain, home to the legendary “Champ” lake monster.

Echolocation in a fresh water lake | Journal of the Acoustical Society of America.

In 2003, 2005 and 2009 sites on Lake Champlain were explored using computers with National Instruments Polynesia real‐time sound analysis, NI PCMCIA 6062‐E cards, DAT recorders, GPS, amplifiers, three vector sensors, two hydrophones, and a Nagra IV‐S5. ECHO Aquarium in Vermont facilitated recording of known lake inhabitants and non‐biological signals studied formed the basis for the control. Neither recordings nor literature indicate that the known native creatures echolocate. Combining wavelet applications, aiding in reduction in ambient noise in this opposing environment along with conventional analysis, the experiments have been able to conduct far reaching, low‐noise sound measurements and were capable of detecting signals the nature of which suggests the presence of some interesting and unexpected phenomena within the ranges and inherent structure of Beluga whale, killer whale, and dolphin echolocation. To protect Lake Champlain, further investigations into this acoustic anomaly is encouraged.
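The abstract credits "wavelet applications" with reducing ambient noise in the recordings. The paper's actual processing chain is not spelled out here, so the following is only a generic sketch of wavelet-threshold denoising applied to a synthetic hydrophone-style signal (using PyWavelets; the click frequency, wavelet choice and thresholds are illustrative assumptions, not the authors' settings):

```python
import numpy as np
import pywt

# Synthetic hydrophone-style recording: a brief high-frequency "click"
# buried in broadband ambient noise (illustrative stand-in data only).
rng = np.random.default_rng(2)
fs = 96_000                                   # sample rate, Hz
t = np.arange(0, 0.05, 1 / fs)
click = np.sin(2 * np.pi * 40_000 * t) * np.exp(-((t - 0.025) * 2000) ** 2)
signal = click + rng.normal(0, 0.2, t.size)

# Wavelet-threshold denoising: decompose, soft-threshold the detail
# coefficients, and reconstruct the cleaned signal.
coeffs = pywt.wavedec(signal, "db8", level=6)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise-level estimate
thresh = sigma * np.sqrt(2 * np.log(signal.size))    # universal threshold
denoised_coeffs = [coeffs[0]] + [
    pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]
]
denoised = pywt.waverec(denoised_coeffs, "db8")
```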

Credit: Scott Mardis on Monster Talk Facebook page

Scott says the paper should be published in this same journal next year. Very interesting! It does not mean we should jump to the conclusion that this is Champ, but this is the sort of evidence needed and procedure followed to get scientific attention. There is an anomaly; let’s go see what it is.


Posted via email from The New Word Order

Dario Gil: The Next Era of Computing: Learning Systems

via Technology on HuffingtonPost.com by Dario Gil on 10/20/11

When IBM's Watson defeated two past champions on TV's Jeopardy! game show last February, it awoke many people to the awesome power of computing. Watson demonstrates that computers are at last becoming learning systems, capable of consuming vast amounts of information about the world, learning from it and drawing conclusions that can help humans make better decisions.

At IBM Research, we believe that learning systems will shape the future of information science and the IT industry, and that Watson represents a very significant step on that journey.

But every innovator needs a target to aim for, so, after the Jeopardy! challenge, we're searching for the next "grand challenge" that will drive the next advances in Information Technology. To help shape our thinking, we're engaging in a conversation about the future of computing with scientists and business leaders at an IBM Research Colloquium on Friday at the lab in Yorktown Heights, N.Y. The questions we're asking are straightforward: What should the next grand challenge be? How should we design it? How should we pursue it?

We want to cast a wider net, as well. The Jeopardy! contest inspired a team of IBM and university researchers to create a system that could beat the best Jeopardy! champions. What "grand challenge" would you choose? Hopefully, the colloquium and follow-up conversations will help us set an audacious goal.


The colloquium is part of an IBM centennial program designed to convene thought leaders -- including leading scientists, academics, industry leaders, public policy makers and IBM clients -- for a series of talks and panel discussions on transformational technologies and their potential impact on the world. In addition to addressing learning systems, there will be guest lectures at the colloquium about emerging, disruptive technologies that will change the computing landscape and help enable learning systems in the future -- biologically inspired nanosystems, exascale-level processing and the analysis of massive quantities of data from multiple sources.

The decision to focus on learning systems for this particular lab event emerged out of a year-long project that was connected to the IBM centennial. The leaders of IBM Research asked a group of us to look out decades into the future and identify the most important trend in computing that we believe will be a major focus of interest over that long time span. After much deliberating, we chose learning systems.

We picked this topic, in part, because of our belief that for all that computing does for us today, it doesn't yet do nearly enough. We need new systems that can become our partners in expanding the horizon of human cognition to help us navigate the increasing complexity of our globally interconnected world. Until now, most electronic computers have been based on the "calculating" paradigm. Our expanding technology frontiers are providing us with the opportunity to build a new class of systems that can learn from both structured and unstructured data, find important correlations, create hypotheses for these correlations, and suggest and measure actions to enable better outcomes for users. Systems with these capabilities will transform our view of computers from "calculators" to "machines that learn", a shift that will radically alter our expectations of what computing is and the nature of problems it should help us solve. These systems will impact virtually every sector of the economy, enabling applications and services that will range from preventing fraud and providing better security, to helping launch new products, to improving medical diagnosis.

Achieving this level of performance will require advances (and sometimes breakthroughs) in learning algorithms and architectures, expanded data input and output modalities (e.g. the ability to process text, graphs, images, video, sound, and other sensory information) and novel device technologies that will exploit the latest semiconductor and nanotechnology advances (as an example, researchers at IBM are actively working on employing phase-change-memory crossbar arrays to mimic neuronal synapses, paving the way for a new class of biologically inspired neuromorphic computation).

We believe that there will be three phases in the learning systems revolution.

The first phase will be driven by "static" learning systems. The Watson system that was built to play Jeopardy! is a good illustration of a state-of-the-art "static" learning system. The term "static" reflects the fact that researchers had to feed information to Watson, teach it how to play the Jeopardy! game and tweak the programming when they spotted flaws in Watson's game play.

In a second phase, which we call "dynamic," the systems will constantly mine information on their own from multiple domains via multiple sources, including text, video and audio. They'll engage in deeper reasoning, taking advantage of the ability to perform higher levels of semantic abstraction to better understand how pieces of information relate to one another.

The third phase would involve "autonomous" learning systems. In this phase, the systems would achieve understanding of natural language, image, voice, emotion, and other sensory information; be able to self-formulate hypotheses and generate questions across arbitrary domains; and select among multiple algorithms to learn autonomously.

At IBM, we believe that exponential growth in our industry has been achieved by a combination of continual improvement and disruptive innovation. Today, it's time for a huge disruption: learning systems. What are your ideas? What grand challenge should we choose?

Posted via email from The New Word Order

Foursquare CEO Dennis Crowley: “The Daily Deal Companies Are Version 1.0”

via TechCrunch by Erick Schonfeld on 10/20/11

Crowley 2.0

In Part II of my TCTV interview with Foursquare CEO Dennis Crowley, we get down to brass tacks: How will Foursquare make money? (In Part I, we talked about Radar, Siri, and how mobile interfaces are changing). Foursquare is already experimenting by partnering with various daily deal companies, including Groupon, to show nearby local deals to Foursquare users. But Foursquare is ultimately taking a different approach. “The daily deal companies are version 1.0 of great things you can build with the Internet that help local merchants drive foot traffic into the door. What we are doing with Foursquare is version 2.”

Groupon is great at driving lots of customers into stores, he acknowledges, “but there is always a question of whether they are repeat customers.” Foursquare is focused more on loyalty—identifying loyal customers and rewarding them (with Mayor and Check-in specials). Crowley thinks the bigger opportunity is to give local merchants the data to segment their customers. People who check in a lot are loyal, those who don’t check in any longer are lost (and maybe there are ways to bring them back), and people checking into similar places in the vicinity are good potential prospects.
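Crowley's loyal / lost / prospect split is essentially recency-and-frequency segmentation of check-in data. A hypothetical sketch of the idea (the data, field names and thresholds are mine, not Foursquare's; spotting prospects would additionally need cross-venue check-ins, which this omits):

```python
from datetime import date, timedelta

# Hypothetical check-in log for one venue: user -> check-in dates.
# (Illustrative data and thresholds only; not Foursquare's schema or API.)
checkins = {
    "alice": [date(2011, 10, 1), date(2011, 10, 12), date(2011, 10, 25)],
    "bob":   [date(2011, 6, 2)],
    "carol": [date(2011, 9, 30), date(2011, 10, 20)],
}
today = date(2011, 10, 31)

def segment(dates, loyal_min=3, lapsed_after_days=60):
    """Classify a customer as loyal, lapsed, or occasional."""
    if today - max(dates) > timedelta(days=lapsed_after_days):
        return "lapsed"      # hasn't been back recently; try to win them back
    if len(dates) >= loyal_min:
        return "loyal"       # frequent, recent visits; reward them
    return "occasional"

for user, dates in checkins.items():
    print(user, segment(dates))
```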

“The best thing about Groupon and Livingsocial is they taught an army of merchants that there are better tools,” says Crowley. “We know we are going to be very good at helping merchants identify their best customers and building the tools that drive new customers into the business.” With Radar and Explore, Foursquare is starting to recommend places to go. I asked Crowley if there would ever be paid recommendations popping up in Foursquare. It is not something he is planning, but he did not rule it out.

Of course, Groupon is also trying to come up with ways to reward loyalty and not just first-time visits. I pressed Crowley on what many see as Foursquare’s biggest weakness. There is no way to tell whether people who check in actually buy anything. Foursquare needs a way to close the redemption loop between an offer and a purchase. “We have thought of different ways to get involved in the payment process,” says Crowley. One way is to strike more deals with credit card companies like its AmEx deal, which offers check-in specials redeemed at purchase by swiping your credit card. Foursquare is working on getting some of that transaction data so that it can help merchants determine which promotions work and which ones don’t.

(Watch Part I of this interview as well).


Dennis Crowley is a co-founder of Foursquare, a location-based social networking service. Previously, he co-founded Dodgeball, a network of the same nature which sold to Google in 2005. He has been named one of the “Top 35 Innovators Under 35” by MIT’s Technology Review magazine (2005) and has won the “Fast Money” bonus round on the TV game show Family Feud (2009). His work has appeared in The New York Times, The Wall Street Journal, Wired, Time Magazine, Newsweek, MTV,...

Company: Foursquare
Website: foursquare.com
Launch Date: November 3, 2009
Funding: $71.4M

Foursquare is a geographical location based social network that incorporates gaming elements. Users share their location with friends by “checking in” via a smartphone app or by text message. Points are awarded for checking in at various venues. Users can connect their Foursquare accounts to their Twitter and Facebook accounts, which can update when a check in is registered. By checking in a certain number of times, or in different locations, users can collect virtual badges. In addition, users...



Posted via email from The New Word Order