From the outside in

Friday, December 21, 2012

6 Things the NRA Didn't Blame For Mass Shootings

In this morning's National Rifle Association (NRA) press conference, Executive Vice President Wayne LaPierre found a lot of things to blame for the Sandy Hook Elementary School tragedy, much rehashed from the NRA's past responses to mass shootings. Video games, the absence of armed policemen in schools, and pure evil made the list, as did Hurricane Sandy.

Here's what LaPierre didn't blame:

THE .223 BUSHMASTER SEMI-AUTOMATIC ASSAULT RIFLE

The weapon used by Adam Lanza when he massacred 26 children and adults at Sandy Hook Elementary School in Connecticut, according to the medical examiner (December 2012).


THE AR-15 ASSAULT RIFLE

One of the weapons used by James Holmes at a movie theater in Aurora, Colorado, where he killed or injured a total of 70 people (July 2012).


.40-CALIBER GLOCK

The weapon used by Jeffrey Weise, who murdered 9 people and wounded 5 others on the Red Lake Indian Reservation in Minnesota (March 2005).


GLOCK 19 SEMI-AUTOMATIC PISTOL

One of the weapons used by Seung-Hui Cho, who killed or injured a total of 56 people on the Virginia Tech campus (April 2007).


AK-47

The weapon used by former Caltrans employee Arturo Reyes Torres, who opened fire at a maintenance yard, killing 5 and injuring 2 (December 1997).


INTRATEC TEC-9 PISTOL

One of the weapons used by Eric Harris and Dylan Klebold, who opened fire at Columbine High School, killing or injuring 39 (April 1999).

Photos courtesy of Academy Sports and Outdoors, Wikimedia Commons, Wikimedia Commons, Info4Guns, Deadliest Warrior, Arms List

Newtown Alumni Fund for the Sandy Hook and Newtown Communities


Watch The World Not End All Day With This Live Stream From The International Space Station

If the world breaks open or shudders to a halt today, the folks on the International Space Station will be the first to know about it. Or, you know, the second. The first to know about it will be the folks who are suddenly swallowed whole by a raging Earth or thrown from the planet’s surface into the frigid, uncaring void of space, but they’re probably not going to be much for reporting back on what’s happening, what with all the screaming and crying and begging for mercy. Sissies. Anyway, if you find yourself needing reassurance that the world is not in fact ending, look no further than the ISS’s eye in the sky live stream, embedded below for your convenience. The feed will give you an astronaut’s eye view of all life on Earth… moving on uninterrupted in pretty much the way it does every day. Hey, don’t look at us — we said it was reassuring, not exciting.

Broadcasting live with Ustream

The live stream from the ISS isn’t perfect — depending on what’s happening on the ISS itself, where the station is in its orbit, and what kind of contact it has with the planet’s surface, the ISS Earth cam may stop showing the Earth and begin showing the inside of the station. The feed may even just go dark. If this happens, don’t worry — it is not a sign of the end times. Probably.

Don’t worry if your view from the feed camera is limited, though — if a serpent made of flame or a legion of demons emerges from the world’s dark places to shatter our peaceful existence, the folks at the ISS will no doubt mention it at some point. Those tend to be the sorts of things that dominate conversation around the water cooler.

(via International Space Station, image courtesy of NASA)

Relevant to your interests

For all of those still worried, NASA's got you covered...

Thursday, December 20, 2012

There’s a New Form of Magnetism, New State of Matter Thanks to MIT

We all learned in elementary school that the three states of matter are solid, liquid, and gas. Then if you took science classes in high school you probably learned plasma was a state of matter too. For most people, it stops there, but there are actually a lot more states matter can get itself into, and science just went and found a new one. MIT researchers have discovered a new state of matter, complete with its own unique form of magnetism. We can’t wait to see what the Insane Clown Posse has to say about this.

The new state of matter is known as a quantum spin liquid (QSL). It had been believed possible since 1987, but had never been proven until now. The researchers at MIT spent months growing a small sample of a crystal known as herbertsmithite, a mineral named after mineralogist Herbert Smith, to test the hypothesis that it was a QSL. Turns out, it is. They published their findings in a paper titled “Fractionalized Excitations in the Spin-Liquid State of a Kagome-Lattice Antiferromagnet.” If you’re in a band, and that isn’t the name of your next album, then you’re doing it wrong.

When most people think of magnets, what they’re thinking of are examples of ferromagnetism. In a ferromagnet, the magnetic moments (spins) of the electrons all align in the same direction, which is what produces the north/south poles of magnets. In an antiferromagnet, neighboring electron spins point in opposite directions and essentially cancel out the object’s net magnetization. Herbertsmithite, or ZnCu3(OD)6Cl2 if you want to get technical about it, is a kagome-lattice antiferromagnet.

The new type of magnetism found in QSLs like herbertsmithite is unique because the electron spins constantly fluctuate in direction. The strong interaction between the electrons prevents them from ever locking into place. So, while the state of the crystal is solid, the state of its magnetic field is in constant flux.
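
The difference between these three behaviors can be caricatured in a few lines of Python. This is a purely classical cartoon (real spin-liquid physics involves quantum entanglement, not coin flips), and the names here are invented for illustration:

```python
import random

def net_magnetization(spins):
    """Average magnetic moment of a chain of spins (+1 = up, -1 = down)."""
    return sum(spins) / len(spins)

N = 1000

# Ferromagnet: every spin aligned the same way, giving a strong net moment
# (the familiar north/south poles).
ferro = [+1] * N

# Antiferromagnet: neighboring spins point in opposite directions, so the
# moments cancel and the net magnetization is zero.
antiferro = [+1 if i % 2 == 0 else -1 for i in range(N)]

# Spin-liquid cartoon: the spins never lock into place, so any snapshot
# looks disordered and the average moment hovers near zero.
random.seed(0)
liquid_snapshot = [random.choice([+1, -1]) for _ in range(N)]

print(net_magnetization(ferro))            # 1.0
print(net_magnetization(antiferro))        # 0.0
print(net_magnetization(liquid_snapshot))  # close to zero
```

The point of the cartoon: ferro and antiferro order are both static, while a spin liquid keeps fluctuating, so its "snapshot" is different every time you look.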

Besides giving you a new state of matter to throw around the next time someone tells you there are only four, QSLs could potentially lead to new forms of magnetic data storage, communications, and superconductors that can operate at higher temperatures.

(via ExtremeTech, image via sparr0)

Relevant to your interests

Creators of Zork to accept Pioneer Award at DICE Summit, hide WIRED interview behind new text adventure



If you've ever been eaten by a grue, you can blame Dave Lebling, Marc Blank and a small team of their friends -- Zork, and the notoriously frustrating text adventure genre that followed, is all their fault. The games were challenging, but they were also the most complex narratives told through video games at the time, and their creators are finally getting their due. Early next year, Blank and Lebling are slated to receive the Academy of Interactive Arts and Sciences' Pioneer Award at the DICE Summit. The name implies the details: the award honors those who helped pioneer the gaming industry with their early work, ultimately paving the way for the titles and hardware we enjoy today. How influential was the title? Too young for nostalgic reminiscences of "interactive fiction?" Head on over to Wired for a lesson in history -- it's hidden its entire interview with Dave Lebling behind a text adventure of its own design.

Oh, the endless hours I spent in the Great Underground Empire. Well-deserved award. Congrats to all.

Wednesday, December 19, 2012

It's official: Florida passes the millionth concealed weapons permit milestone



TALLAHASSEE -- Florida officials don't know exactly when, but sometime in the last 24 hours someone received the 1 millionth concealed weapons permit in Florida -- making it the first state to reach that milestone.

"We have 1,000,645 concealed licenses as of this morning," said Amanda Bevis, a spokeswoman for Agriculture Commissioner Adam Putnam, who oversees the permit program. "Yesterday, we had 999,932, so we crossed the milestone at some point yesterday."

Putnam held a news conference last week -- two days before the Newtown, Conn. shooting -- that touted the milestone. While Putnam said at the news conference that he wasn't celebrating the 1 million mark, the press release was titled "Firearm License Program is One Million Strong."

Posted by Michael Van Sickler at 4:09 PM on Wednesday, Dec. 19 | Permalink


Ohh I feel much safer now...

Report: Data Caps Help Carriers Rake In Huge Profits

In case you weren’t quite certain why wireline and wireless carriers were capping your data, it’s not about bandwidth. Instead, they are able to charge a premium for faster speeds and more data, thereby raking in profits over and above what they’ve gotten in the past. In short, write Hibah Hussain, Danielle Kehl, Benjamin Lennett, and Patrick Lucey of the New America Foundation:

Internet service and mobile providers appear to be one of the few industries that seek to discourage their customers from consuming more of their product. The reason for this counterintuitive business model is that in the noncompetitive US marketplace, it is highly profitable.

The paper itself goes into the history of bandwidth caps and isn’t saying much we didn’t already know. Basically the false scarcity imposed by service providers helps them maintain high profits while blaming data hogs. As data usage went up, however, the cost of providing that data went down.


This combination of caps and higher prices also applies to services like text messaging. Even though “the cost a carrier incurs by transmitting an SMS message has not increased in recent years,” carriers have continued raising prices and imposing limits. [8] As with data caps, the prices charged to consumers do not correspond with the costs for carriers.

ISPs often claim that caps are necessary to curb “excessive use” and only affect a small fraction of users. Although some providers are reexamining their data cap policies, many of the limits imposed several years ago have largely remained static, even as typical household bandwidth consumption has substantially increased. In 2008, Comcast reported that its median residential broadband user consumed 2.5 GB of data monthly. [9] In 2012, Comcast reports that this number has quadrupled to a median monthly usage of 8-10 GB per consumer. [10] Other sources report even higher usage numbers. According to the Federal Communications Commission’s (FCC) Measuring Broadband America report, the median cable broadband user in the United States consumed about 28 GB a month in mid-2012. [11] As new Internet applications and devices continue to be created, yesterday’s so-called “bandwidth hogs” are today’s typical users.
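
The report's own numbers make the trend easy to check. A quick back-of-the-envelope sketch, using only the 2.5 GB and 8-10 GB figures quoted above (taking the upper end of the 2012 range):

```python
# Median monthly usage figures quoted in the report.
usage_2008 = 2.5   # GB, Comcast median, 2008
usage_2012 = 10.0  # GB, Comcast median, 2012 (upper end of the 8-10 GB range)
fcc_2012 = 28.0    # GB, FCC Measuring Broadband America median, mid-2012

growth = usage_2012 / usage_2008
annual = growth ** (1 / 4)  # compound annual growth over the 4-year span

print(growth)            # 4.0, i.e. "quadrupled", as Comcast says
print(round(annual, 2))  # 1.41, i.e. roughly 41% growth per year
```

At that pace, a cap that looked generous in 2008 catches the median user within a few years, which is exactly the report's point about static caps.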

We all know the carriers will come back with the old “But infrastructure costs money!” argument, but considering the slow roll-out of FiOS and related high-speed data services, they’re clearly sitting on their cash hoard until enough of us complain. As the authors note, “Data caps encourage a climate of scarcity in an increasingly data-driven world.” And scarcity makes money.

via Ars

HELLO #ATT ?

#ATT unlimited data, NOT


So if you thought you had unlimited access using AT&T cell service, think again... I received notice today that I exceeded their 3GB data limit for the month and my connections would be throttled from now on. I was told, "Sir, you have unlimited access, it's just the connection speed that will be slowed down."

I'm grandfathered in on the unlimited data plan from the original iPhone offerings from years ago. Of course, I can upgrade to the $30/month data package, which is also limited to 3GB. I'm not a happy camper at the moment... Guess I can go back to 14.4?

After Newtown Massacre, Video Games Legislation Beats Gun Control Bills To Congress


And this is what happens when Faux News starts a campaign.

Tuesday, December 18, 2012

Utah Elementary School Student Brings Gun To School ‘For Protection’ Post-Sandy Hook

(Image via The Brady Campaign)

A sixth-grader in Kearns, Utah brought an unloaded handgun to his elementary school on Monday, reportedly at the urging of his parents.

According to the local Fox affiliate, the 11-year-old told his fellow students he was encouraged by his parents to bring the gun to school “for protection” following the shootings at Sandy Hook Elementary School in Newtown, Connecticut on Friday. Police are currently determining what role the parents played in the student’s actions, but the school acted quickly to disarm the boy after learning he had the firearm on school grounds:

The boy reportedly pulled the gun, a .22-caliber pistol, out of his backpack during recess Monday morning.

“At recess, he pointed a gun to my head and said he was going to kill me,” said Isabel Rios, one of the boy’s fellow 6th grade students.

Granite School District officials say students didn’t notify teachers about the weapon until 3 p.m.

Far-right advocates of looser gun restrictions have been pushing since Friday for more guns in schools to prevent tragedies like the one that occurred in Newtown. Among the proposals being floated are allowing teachers to bring guns to class in Oklahoma and arming teachers with assault rifles. These suggestions come despite renewed public support for stricter gun control laws.


This is what's going to start
I don't feel any safer do you????

Monday, December 17, 2012

End of Mission: GRAIL Spacecraft Impact a Mountain on the Moon


A job well done, thank you #NASA

Cat Car turns your feline into a furry RC vehicle


Lasers, Arduinos, cats doing funny things -- here's a student project custom built for the internet age. We popped by the Winter Show at NYU's ITP school to check out a new batch of works exploring the intersection between art and technology, and couldn't help but be enamored by Cat Car, the "feline fitness frenzy." Designed as a sort of exercise contraption for our furry friends, Sam Brenner's project eventually blossomed into something far more entertaining, though he assures us that "the safety and wellbeing of the cats involved [were his] top priorit[ies]." Cat Car uses a steering wheel controller with an attached Arduino and gyroscope / accelerometer, which communicates with a cat harness via an XBee. The movements control a servo on the back of the cat, which moves around a laser pointer, propelling the cat forward and thus allowing the user to remotely control the cat. A video of this magic can be found after the break.
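
The control chain described above is: steering-wheel gyro, then an XBee radio link, then a servo aiming the laser. Here's a hedged sketch of the core input-to-servo mapping as plain Python (the real project runs on an Arduino; function names, angle ranges, and servo limits here are invented for illustration):

```python
def steer_to_servo(gyro_angle_deg, servo_min=30, servo_max=150):
    """Map a steering-wheel tilt (-90..+90 degrees) onto a servo angle.

    The servo swings the laser pointer left or right in front of the cat,
    which chases the dot and so "drives" in the commanded direction.
    """
    # Clamp the wheel input to the usable range.
    angle = max(-90, min(90, gyro_angle_deg))
    # Linear interpolation: -90 deg -> servo_min, +90 deg -> servo_max.
    span = servo_max - servo_min
    return servo_min + (angle + 90) / 180 * span

print(steer_to_servo(0))    # 90.0 (laser dead ahead)
print(steer_to_servo(-90))  # 30.0 (hard left)
print(steer_to_servo(90))   # 150.0 (hard right)
```

On the actual hardware this value would be written to the servo each loop iteration, after arriving over the XBee link from the wheel.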

Cats as entertainment.

Friday, December 14, 2012

Google Updates Home Page With Vigil Candle For Victims Of School Shooting

Media_httptctechcrunc_ihayh

#prayfornewtown

Poem for this day

tears that end the day 
began with meteors across our sky
may all our hearts still be among the stars in heaven 
peace to all

Covering CT Shootings: Let’s Be Right, Not First


My sentiments exactly as I watched this story unfold throughout the day

The moon marks 40 years without a human visitor, prepares for impending probe crashes


It's likely not an anniversary anyone thought we would meet after the first moon landing, but today marks 40 years since Gene Cernan left the last footprint on the moon as Apollo 17 ended its mission. That was the last of six manned missions to the lunar surface (nine including those that didn't land), which saw twelve men walk on the moon in all. The years since have, of course, seen continued exploration of the moon through other means, and next week will bring another major event when NASA's twin GRAIL spacecraft conduct a planned crash into a mountain near the lunar north pole. The probes have been in orbit since January 1st, creating a high-resolution map of the moon's gravitational field and collecting data that promises to provide more detail than ever about the moon's internal structure and composition. You'll be able to follow along on NASA's website as it happens, beginning at 5PM Eastern on Monday, December 17th.

[Image credit: NASA / Eugene Cernan]

Mississippi River Faces Shipping Freeze As Water Levels Drop

It's the second extreme event on the river in 18 months, after flooding in the spring of 2011 forced thousands to flee their homes.

Without rain, water levels on the Mississippi are projected to reach historic lows this month, the National Weather Service said in its latest four-week forecast.

"All the ingredients for us getting to an all-time record low are certainly in place," said Mark Fuchs, a hydrologist at the National Oceanic and Atmospheric Administration (NOAA) in St Louis. "I would be very surprised if we didn't set a record this winter."

The drought has already created a low-water choke point south of St Louis, near the town of Thebes, where pinnacles of rock extend upwards from the river bottom making passage treacherous.


Tim McDonnell/Climate Desk

Shipping companies are hauling 15 barges at a time instead of a typical string of 25, because longer runs are unworkable in current operating conditions.

Barges are being sent off with lighter loads, making for more traffic, with more delays and back-ups. Stretches of the river are now reduced to one-way traffic. A long cold spell could make navigation even trickier: shallow, slow-moving water is more likely to get clogged up with ice.

Current projections suggest water levels could drop too low to send barges through Thebes before the new year—unless there is heavy rainfall.

Local television in St Louis is already dispensing doom-laden warnings about rusting metal and hazardous materials exposed by the receding waters.

Shipping companies say the economic consequences of a shut-down on the Mississippi would be devastating. About $7bn in vital commodities typically moves on the river at this time of year—including grain, coal, heating oil, and cement.

Cutting off the transport route would be a disaster that would resonate across the mid-west and beyond.

"There are so many issues at stake here," said George Foster, owner of JB Marine Services. "There is so much that moves on the river, not just coal and grain products, but you've got cement, steel for construction, chemicals for manufacturing plants, petroleum plants, heating oil. All those things move on the waterways, so if it shuts down you've got a huge stop of commerce."

Local companies which depend on the river to ship their goods are already talking about lay-offs, if the Mississippi closes to navigation. Those were just the first casualties, Foster said. "It is going to affect the people at the grocery store, at the gas pump, with home construction and so forth."

And it's going to fall especially hard on farmers, who took a heavy hit during the drought and who rely on the Mississippi to ship their grain to export markets.

Farmers in the area lost up to three-quarters of their corn and soy bean crops to this year's drought. Old-timers say it was the worst year they can remember.

"We have been through some dry times. In 1954 when my dad and grandfather farmed here they pretty much had nothing because it was so dry," said Paul McCormick who farms with his son, Jack, in Ellis Grove, Illinois, south of St Louis. "But I think this was a topper for me this year."

Now, however, farmers are facing the prospect of not being able to sell their grain at all because they can't get it to market. The farmers may also struggle to find other bulk items, such as fertiliser, that are typically shipped by barge.

"Most of the grain produced on our farm ends up bound for export," said Jack McCormick, who raises beef cattle and grain with his father. "It ends up going down the river. That is a very good market for us, and if you can't move it that means a lower price, or you have to figure out a different way to move it. It all ends up as a lower price for the farmers."


Tuesday, December 11, 2012

24 ways: Design Systems

The most important part of responsive web design is that, no matter what the viewport width, the content is accessible in an optimum display. The best responsive designs are those that allow you to go from one optimised display to another, but with the feeling that these experiences are part of a greater product whole.

Responsive design: where we’ve been going wrong

Responsive web design was a shock to my web designer system. Those of us who had already been designing sites for mobile probably had the biggest leap to make. We might have been detecting user agents in order to deliver a mobile-specific site, or using the slightly more familiar Bushido technique to deliver sites optimised for device type and viewport size, but either way our focus was on devices. A site was optimised for either a mobile phone or a desktop.

Responsive web design brought us back to pre-table layout fluid sites that expanded or contracted to fit the viewport. This was a big difference to get our heads around when we were so used to designing for fixed-width layouts. Suddenly, an element could be any width or, at least, we needed to consider its maximum and minimum widths. Pixel perfection, while pretty, became wholly unrealistic, and a whole load of designers who prided themselves in detailed and precise designs got a bit scared.

Hanging on to our previous processes and typical deliverables led us to continue to optimise our sites for particular devices and provide pixel-perfect mockups for those device widths.

With all this, we were concentrating on devices rather than content, on deliverables rather than process, and making assumptions about users and their devices based on nothing but the width of the viewport.

I don’t think this is a crime; I think it was inevitable.

We can be up to date with our principles and ideals, but it’s never as easy in practice. That’s why it’s more important than ever to share our successful techniques and processes. Let’s drag each other into modern web design.

Design systems: the principles

What are design systems?

A visual design system is built out of the core components of typography, layout, shape or form, and colour. When considering the design of a whole product, a design system should also include patterns in user flow, content strategy, copy, and tone of voice. These concepts, design decisions or rules, created around the core components are used consistently across your product to create a cohesive feel, whether it’s from one element to another, page to page, or viewport width to viewport width.

Responsive design is one of the most important considerations in the components of a design system. For each component, you must decide what will unite the design across the viewports to maintain that consistent feel, and what parts of the design will differentiate in order to provide a flexible and optimal experience for different viewport sizes.

Components you might keep the same across viewports
  • typeface
  • base unit
  • colour
  • shape/form
Components you might differentiate across viewports
  • grids
  • layout
  • font size
  • measure (line length)
  • leading (line height)
Content: it must always be the same

The focus of a design system is the optimum display of content. As Mark Boulton put it, designing “content out, not canvas in.” Chris Armstrong puts the emphasis on not designing for viewports but for content – “we need to build on what we do know: content.” In order to do this, we must share the same content across all devices and focus on how best to display and represent content through design system components.

The practical: core visual components

Typography first

When you work with a lot of text content, typography is the easiest way to set the visual tone of the design across all viewport widths. It’s likely that you’ll choose one or two typefaces to use across the whole system, but you might change the most legible font size, balanced with the most comfortable measure, as the viewport width changes.

Where typography meets layout

The unit on which you choose to base the grid and layout design, font sizes and leading could be based on the typeface, an optimal reading size, or something more arbitrary. Sometimes I’ll choose a unit based on multiples of ten because it makes the maths in the CSS easier. Tim Brown suggests trying a modular scale. Chris Armstrong suggests basing it on your ideal measure, or the width of a fixed item of content such as an ad unit.
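
The modular-scale idea mentioned above is easy to sketch: pick a base size and a ratio, then step up the scale to get your heading sizes. The values below are illustrative, not prescriptive:

```python
def modular_scale(base=16, ratio=1.5, steps=5):
    """Generate font sizes by repeatedly multiplying a base by a ratio.

    base:  body font size in pixels
    ratio: e.g. 1.5 (a "perfect fifth") or 1.618 (the golden ratio)
    """
    return [round(base * ratio ** n, 1) for n in range(steps)]

# A perfect-fifth scale on a 16px base: each step is 1.5x the previous one.
print(modular_scale())  # [16.0, 24.0, 36.0, 54.0, 81.0]
```

Because every size is derived from the same base and ratio, the type feels related across the whole system, and you can recompute the scale from a different base for narrower viewports.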

Grids and layouts

Sensible grid design can be a flexible yet solid foundation for your design system layout component. But you must be wary in responsive design that a grid might not work across all widths: even four columns could make for very cramped content and one-word measures on smaller screens.

Maybe the grid columns are something you differentiate across widths, but you can keep the concept of the grid consistent. If the content has blocks in groups of three, you might decide on a three-column grid which folds down to one column for narrow viewports. If the grid focuses on the idea of symmetry and has a four-column grid on larger viewports, it might fold down to two columns for narrower viewports. These consistencies may seem subtle, not at all obvious to many except the designer, but it’s all these little constants and patterns across the whole of the design system that makes design decisions easier to make (as they adhere to the guiding concepts of your system), and give the product a uniform feel no matter what the device.
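
The "fold the grid down" rule above can be expressed as a tiny lookup from viewport width to column count. The breakpoints and counts here are placeholders for the sake of the sketch, not recommendations (in production this lives in CSS media queries, not script):

```python
def grid_columns(viewport_px, breakpoints=((480, 1), (768, 2), (1024, 4))):
    """Return how many grid columns a symmetric four-column design uses.

    The four-column grid folds to two, then one, as the viewport narrows,
    keeping the concept of the grid while differentiating its layout.
    """
    for max_width, columns in breakpoints:
        if viewport_px <= max_width:
            return columns
    return breakpoints[-1][1]  # wider than every breakpoint: the full grid

print(grid_columns(320))   # 1 (phone)
print(grid_columns(700))   # 2 (narrow tablet)
print(grid_columns(1280))  # 4 (desktop)
```

The consistent part is the folding rule itself (4 to 2 to 1 preserves the symmetry); only the numbers change per project.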

Shape or form

The shape or form components are concepts you already use in fixed-width web design for a strong, consistent look and feel.

Since CSS border-radius became widely supported by browsers, a lot of designs feature circle themes. These are very distinctive and can be used across viewport widths giving them the same united feel, even if they’re not used in the same way. This could also apply to border styles, consistent shadows and any number of decorative details and textures. These are the elements that make up the shape or form of a design system.

Colour

Colour is the most basic way to reinforce a brand and unite experiences across viewports. The same hex colour used system-wide is instantly recognisable, no matter what the viewport width.

The process

While using a design system isn’t necessarily attached to any particular process, it does lend itself to some process ideals.

Detaching design considerations from viewport widths

A design system allows you to focus separately on the components that make up the system, disconnecting the look and feel from the layout. This helps prevent us getting stuck in the rut of the “Apple breakpoints” (a term brilliantly coined by Simon Foster) of mobile, tablet and desktop. It also forces us to design for variation in viewport experiences side by side, not one after the other.

Design in the browser

I can’t start off designing in the browser – it just doesn’t seem to bring out my creative side (and I’m incredibly envious of you if you can; I just have to start on paper) – but static mock-ups aren’t the only alternative. Style guides and style tiles are perfect for expressing the concepts of your design system. Pattern libraries could also work well.

Mock-ups and breakpoints

At some point, whether it’s to test your system ideas, or because a client needs help visualising how your system might work, you may end up producing some static mock-ups. It’s not the end of the world, but you must ensure that these consider all the viewports, not just those of the iDevices, or even the devices currently on the market. You need to decide the breakpoints where the states of your design change. The blocks within your content will always have optimum points for their display (based on their hierarchy, density, width, or type of interaction) and so your breakpoints should be based around these points.

These are probably the ideal points at which to produce static mockups; treat them as snapshots. They’re not necessarily mock-ups, so much as a way of capturing how your design system would be interpreted when frozen at that particular viewport width.

The future

Creating design systems will give us the flexibility we need for working with the unknown devices of the future. It may be a change in process, but it shouldn’t be too much of a difference in thinking. The pioneers in responsive design have a hard job. Some of these problems may have already been solved in other technologies or industries, but it’s up to the pioneers to find those connections and help us formulate solutions and standards that will make responsive design the best it can possibly be. We need to keep experimenting and communicating, particularly in the area of design, as good user experiences are the true sign of whether our products are a success.


Chart: Michigan’s “Right to Work” law contains verbatim language from ALEC model bill

Why Isn’t This a Front Page Story Nationwide?

On November 20th, 2012, I told you about a guilty plea taken by Lorraine Brown, the founder of DOCX (later known as LPS), in federal court in Florida. The press release for that plea did not come out until after 5 PM on the Tuesday before Thanksgiving. On the Wednesday before Thanksgiving, most of the reporters who usually occupy the front pages of our newspapers and network news were presumably traveling or preparing for their holiday. The story was barely reported.

Lorraine Brown also pled guilty earlier that same day in state court in Missouri. She is rumored to be in plea negotiations in other states.

Even though this is no longer breaking news, it still belongs on the front page of every paper in the country and should be the lead story on every newscast. I’ll tell you why:

Whatever the banks thought about the robo-signing being “sloppy” before, once Lorraine Brown admitted that virtually every document coming out of DOCX/LPS was a forgery and that ALL documents coming out of DOCX/LPS were suspect, the banks that had court cases pending using DOCX/LPS documents had an obligation to either withdraw the documents and/or withdraw the lawsuits and other foreclosure proceedings.

It is a crime (common law fraud) to knowingly use a false, perjured, forged, fraudulent document as “evidence” in court. The specific statute violated will vary from state to state, but it is impossible to conceive that there is a single state where this is legal. If I’m wrong about that, I’m sure someone from the fraud-allowing state will set me straight in the comments. This is certainly a violation of federal criminal law, for example 18 USC §§ 371, 1341, 1342, and 1343 and 39 USC §§ 1341 and 1342.

I have seen no evidence that there has been a wholesale withdrawal of DOCX/LPS documents from evidence or a large scale voluntary dismissal of cases or even letters sent to chief judges saying that the cases should be stayed until they can perform the mechanics of withdrawing the fraudulent “evidence.” Nope, the cases I have been following are still going strong and being prosecuted vigorously.

This also means that the 50 state settlement notwithstanding, the Department of Justice and every attorney general in the country have a brand new, slam dunk, open and shut case against every single bank that is still allowing a foreclosure case to go forward based on DOCX/LPS false evidence.

THAT, my friends, is front page news. And getting it to the front page is essential so that judges know they are being hoodwinked, and homeowners know they should be making motions to dismiss. DOJ clearly knows about the Lorraine Brown guilty plea since the 13th-hour press release came out of Main Justice instead of from the Middle District of Florida press office.

So, I’m asking for your help. Tweet this. Send a copy as a Letter to the Editor (or re-write your own using this information). Send it to the network and cable news shows you watch. Put up your own blog post about it. Email it to your favorite reporters. I don’t care if my little blogpost is the form you promote, just promote the story of the brand new massive fraud on the court that is occurring right now. The banks, and their lawyers, have no excuse. They can no longer harbor any belief, good faith or not, that those documents are not false and fraudulent.

Photo by steakpinball under Creative Commons license.

Google60 makes your web search slower, noisier, and exponentially more amazing


This is fun

Eyes on Andromeda

By Julianne Dalcanton | December 11, 2012 4:30 am

The practice of astronomy is different than it used to be.

Back in the day, the image was of the lone astronomer, sitting at their telescope, communing with the universe.  Over time, we got more used to the idea that groups of astronomers might come together to work on a common project.  But still, there were fairly tight connections between astronomers and their data.

Over the last decade and a half, something fundamental has changed.  Data has gotten big. So big, that it’s impossible for any one person to make sense of it.  More importantly, data of these sizes make it impossible to “notice” anything.  The line of research that probably got me tenured was based on “noticing” something interesting in several dozen galaxies.  But how do you “notice” something in hundreds of terabytes of data?

The standard answer these days is (naturally) computers.  Computer science is great at problems like this, and many astronomers are working at the interface with CS these days.  But that said, there are some problems that software is simply lousy at.  So what do you do when your scientific interests run smack into a problem that you can’t code your way out of?

Which brings me to the Andromeda Project. For the past 2-3 years I’ve been running a ridiculously huge Hubble Space Telescope (HST) program to map out a big chunk of the Andromeda galaxy (see here for the project web site, here for a more friendly introduction, and here for more technical details than you’d ever want to know).  The project is great — we’re measuring several dozen properties of more than 100 million stars (or, as I prefer to think of them, 0.1 billion stars), using light from the ultraviolet, optical, and near-infrared.  But we’ve easily passed into the new world of Big Data.

There are countless projects we’re hoping to do with these data, and for those that deal with individual stars, we’re in great shape. But, we had one big problem.  Stellar clusters.

Stellar clusters are groups of stars that formed from the same gas cloud, at the same time, with the same chemical composition. They are probably the dominant birthsites for young stars, and are incredibly important for understanding all sorts of things about the life cycles of stars.  However, they are a remarkable pain to try to find with a computer.  Believe me, we’ve tried.  And tried. And tried some more. But in the end, nothing works as well as humans.

So what to do? Well, at first we had 8 PhD-level scientists spend months looking at a small fraction of the data. And while that worked, there was a lot of other important stuff that those 8 PhD-level scientists could have been doing instead.

Luckily, we found a solution, through collaborating with the Zooniverse crew.  The idea behind the Zooniverse is that anyone can be a citizen scientist, with a little bit of training and the right kind of project.  Finding stellar clusters was perfect for this approach — the data is gorgeous, the problem hard but not impossible, and the routine task is straightforward and actually rather fun. The Zooniversians worked closely with my team to make an unbelievable web site that makes searching for clusters simple for anyone.

At this point, we’ve been live for less than a week.  In that short space of time, many thousands of people from all over the world have performed several hundred thousand searches for stellar clusters in our library of Hubble images.  As a scientist, I’ve been blown away by people’s enthusiasm for the project, and the careful work they’ve done.
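For a flavor of how many independent volunteer marks might be boiled down to consensus cluster candidates, here is a toy grid-voting sketch. To be clear, the Andromeda Project's real aggregation pipeline isn't described in the post; the function name, cell size, and vote threshold below are all my own illustrative assumptions.

```python
# Toy sketch: aggregate crowd-sourced (x, y) marks into consensus candidates.
# Purely illustrative -- not the Andromeda Project's actual pipeline.
from collections import Counter

def consensus_candidates(marks, cell_size=10.0, min_votes=3):
    """Bin volunteer marks into grid cells; cells collecting at least
    `min_votes` independent marks become cluster candidates."""
    votes = Counter(
        (int(x // cell_size), int(y // cell_size)) for x, y in marks
    )
    # Return the center of each well-supported cell.
    return [
        ((cx + 0.5) * cell_size, (cy + 0.5) * cell_size)
        for (cx, cy), n in votes.items()
        if n >= min_votes
    ]

# Five volunteers mark roughly the same spot; one stray click elsewhere:
marks = [(101, 52), (103, 55), (99, 58), (104, 51), (102, 57), (300, 400)]
print(consensus_candidates(marks))  # one candidate near (105, 55)
```

The appeal of any scheme like this is that individual mistakes wash out: a stray click gets outvoted, while a real cluster that many eyes agree on survives the threshold.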

However, we’re nowhere near finished.  If you have a little time on your lunch break, please click on through to http://www.andromedaproject.org and give us a hand!

 

 

CATEGORIZED UNDER: Science, Science and Society, Space


Another exceptional project from the Zooniverse team, and well worth your time and effort.

Why Do Republicans Seem So Stupid? Well... Because They Are

In fact, a more productive question might be to ask why stupid people are attracted to conservatism and to the Republican Party. Digital Journal released the results of a four-year study of the Republican base-- Fox News viewers-- and they found exactly what anyone who has spent any time talking to one of them would expect: they're really stupid. Not just ignorant-- which, of course, they are-- but, far worse, incapable of processing abstract thought.

The results of a 4 year study show that Americans who obtain their news from Fox News channel have an average IQ of 80, which represents a 20 point deficit when compared to the U.S. national average of 100. IQ, or intelligence quotient, is the international standard of assessing intelligence.

Researchers at The Intelligence Institute, a conservative non-profit group, tested 5,000 people using a series of tests that measure everything from cognitive aptitude to common sense and found that people who identified themselves as Fox News viewers and 'conservative' had, on average, significantly lower intelligence quotients. Fox viewers represented 2,650 members of the test group.

One test involved showing subjects a series of images and measuring their vitals, namely pulse rate and blood pressure. The self-identified conservatives' vitals increased over 35% when shown complex or shocking images. The image that caused the most stress was a poorly edited picture of President Obama standing next to a "ghostly" image of a child holding a tarantula.

Test subjects who received their news from other outlets or reported they do not watch the news scored an average IQ of 104, compared to 80 for Fox News viewers.


Lead researcher, P. Nichols, explains, "Less intelligent animals rely on instinct when confronted by something which they do not understand. This is an ancient survival reaction all animals, including humans, exhibit. It's a very simple phenomenon, really; think about a dog being afraid of a vacuum cleaner. He doesn't know what a vacuum is or if it may harm him, so he becomes agitated and barks at it. Less intelligent humans do the same thing. Concepts that are too complex for them to understand, may frighten or anger them."

He continues, "Fox News' content is presented at an elementary school level and plays directly into the fears of the less educated and less intelligent."

The researchers said that an IQ of 80 is well above the score of 70, which is where psychiatrists diagnose mental retardation. P. Nichols says an IQ of 80 will not limit anyone's ability to lead happy, fulfilling lives.

The study did not conclude if Fox News contributed to lowering IQ or if it attracts less intelligent humans.

P. Nichols concludes that he wasn't shocked by the study's results, but rather by how dramatic the range was. "Several previous studies show that self-identified conservatives are less intelligent than self-identified moderates. We have never seen such a homogeneous group teetering so close to special needs levels."
Last year and again last spring researchers at Fairleigh Dickinson University came to the conclusion that people who watch Fox News are the least informed of all TV viewers. If you have a brother-in-law anything like mine, you know that doesn't stop them from loudly and incessantly repeating the nonsense propaganda absorbed by their dull malleable brains. Fox viewers were compared with people who watch no TV, people who watch only The Daily Show With Jon Stewart, people who watch the Sunday morning talk shows and people who get their news and information from NPR. All categories were significantly more knowledgeable about current news events than Fox viewers, including people who watch no TV at all.

Fox News hit out at Fairleigh Dickinson. A spokesperson for the network told the Hollywood Reporter, "Considering FDU’s undergraduate school is ranked as one of the worst in the country, we suggest the school invest in improving its weak academic program instead of spending money on frivolous polling-- their student body does not deserve to be so ill-informed."


Although he never mentions how the right-wing army is hollow, lame and incredibly dumb, Bill Kristol does mention in the new issue of The Weekly Standard that the conservative movement's "deep disarray" can be traced directly to the hucksterism at its heart-- implying there are sheep too dumb to know they're being shorn over and over and over again.
Reading about some conservative organizations and Republican campaigns these days, one is reminded of Eric Hoffer’s remark, “Every great cause begins as a movement, becomes a business, and eventually degenerates into a racket.” It may be that major parts of American conservatism have become such a racket that a kind of refounding of the movement as a cause is necessary.
Yeah-- and with better followers.



posted by DownWithTyranny @ 9:00 PM

Monday, December 10, 2012

The abrupt end to the 'epidemic of open-mindedness'

For critics of the Republican Party, today's GOP is plagued by intellectual stagnation, a lack of interest in creativity and problem-solving, and epistemic closure that deliberately repels independent thought and ideological diversity. For David Brooks, critics have it all wrong -- there's actually "a vibrant and increasingly influential center-right conversation" underway.

To bolster the point, the New York Times columnist trumpeted a "heralded paper on intellectual property rights" from "rising star Derek Khanna," a Republican Study Committee staffer. Brooks added, "Since Nov. 6, the G.O.P. has experienced an epidemic of open-mindedness. The party may evolve quickly. If so, it'll be powerfully influenced by people with names like ... Derek Khanna."

Alas, the "epidemic" didn't last. Industry lobbyists demanded that the Republican Study Committee withdraw Khanna's report, and GOP policymakers obliged. As of last week, Khanna, the "rising star" cheered by Brooks, suddenly finds himself out of work.

The incoming chairman of the RSC, Steve Scalise (R-LA), was approached by several Republican members of Congress who were upset about a memo Khanna wrote advocating reform of copyright law. They asked that Khanna not be retained, and Scalise agreed to their request.

The release and subsequent retraction of Khanna's memo has made waves in tech policy circles. The document argues that the copyright regime has become too favorable to the interests of copyright holders and does not adequately serve the public interest. It advocates several key reforms, including reducing copyright terms and limiting the draconian "statutory damages" that can reach as high as $150,000 per infringing work.

The memo was widely hailed by tech policy scholars and public interests advocates. However, it raised the ire of content industry lobbyists, who applied pressure on the RSC to retract the memo. The organization did so within 24 hours of its release.

To be sure, an "epidemic of open-mindedness" on the right would be a welcome development, but it remains nowhere in sight.

Sunday, December 9, 2012

Watch this: 'Decay,' a zombie movie made by physicists and filmed at the Large Hadron Collider

Zombie movies are a dime a dozen, but Decay, a full-length film released for free online today, offers an unusual setting for the rise of the living dead. Decay was shot on location at CERN by Luke Thompson, a University of Manchester physics Ph.D. student and first-time filmmaker, and its plot injects zombies into the search for the Higgs Boson, which was likely discovered earlier this year. Other physics students and at least one professor round out the small cast and crew.

In many ways, Decay is standard B-horror, but the dark tunnels around the Large Hadron Collider make for some fantastically creepy scenes. Thompson also hopes that it will satirize popular perceptions of science — the LHC, particularly, has been the epicenter for speculation about world-destroying black holes and other types of super-science. While CERN has no official involvement in the project (the film wasn't set in sensitive locations), it told Wired that Decay "shows how pure science can stimulate creativity." You can watch the whole 76-minute movie on YouTube, or download it for free from the official site.

Friday, December 7, 2012

Scientists Ask Blunt Question on Everyone’s Mind

Many of us have wondered at some point in almost precisely these terms: “Is Earth F**ked?” But it’s not the sort of frank query you expect an expert in geomorphology to pose to his colleagues as the title of a formal presentation at one of the world’s largest scientific gatherings.

Nestled among offerings such as “Bedrock Hillslopes to Deltas: New Insights Into Landscape Mechanics” and “Chemical Indicators of Pathways in the Water Cycle,” the question leapt off the pages of the schedule for the American Geophysical Union’s fall meeting.  Brad Werner, a geophysicist at the University of California, San Diego, is one of the more than 20,000 Earth and atmospheric scientists who descended on downtown San Francisco this week to share their research on everything from Antarctic ice-sheet behavior to hurricane path modeling to earthquake forecasting. But he’s the only one whose presentation required the use of censorious asterisks. When the chairman of Werner’s panel announced the talk’s title on Wednesday, a titter ran through the audience at the naughtiness of it all.

Why shout out the blunt question on everyone’s mind? Werner explained at the outset of the presentation that it was inspired by friends who are depressed about the future of the planet. “Not so much depressed about all the good science that’s being done all over the world—a lot of it being presented here—about what the future holds,” he clarified, “but by the seeming inability to respond appropriately to it.”

That’s probably an apt description of legions of scientists who have labored for years only to see their findings met with shrugs—or worse. Researchers from the Tyndall Centre for Climate Change Research at the University of East Anglia, for instance, published a paper in Nature Climate Change this week showing that carbon emissions have reached record levels, with a 2.6 percent projected rise in 2012. In another AGU presentation, Pieter Tans of the National Oceanic and Atmospheric Administration posed the question: “Will realistic fossil fuel burning scenarios prevent catastrophic climate change?” He did not seem optimistic. “We might end up burning 900 billion tons of carbon” from oil, gas, and coal, he announced. “We can have a managed path to lower emissions—or do it by misery.” A guy next to me in the audience gave a kind of hopeless snort. The head of NOAA and polar experts held a news conference at the conference entitled, “What’s going on in the Arctic?” This year broke all sorts of records: the lowest recorded sea-ice extent, the lowest recorded snow cover extent and duration, and the most extensive recorded melting event on the surface of the Greenland ice sheet, among other milestones. “I’ve studied Greenland for 20 years now; I’ve devoted my career to it,” Jason Box of Ohio State University intoned somberly, “and 2012 was an astonishing year. This was the warmest summer in a period of record that’s continuous in 170 years.”

Werner’s title nodded at a question running like an anxious murmur just beneath the surface of this and other presentations at the AGU conference: What is the responsibility of scientists, many of them funded by taxpayer dollars through institutions like the National Science Foundation, to tell us just exactly how f**ked we are? Should scientists be neutral arbiters who provide information but leave the fraught decision-making and cost-benefit analysis to economists and political actors? Or should they engage directly in the political process or even become advocates for policies implied by their scientific findings?

Scientists have been loath to answer such questions in unequivocal terms. Overstepping the perceived boundaries of prudence, objectivity, and statistical error bars can derail a promising career. But, in step with many of the planet's critical systems, that may be quickly changing. Lately more and more scientists seem shaken enough by what their measurements and computer models are telling them (and not just about climate change but also about the global nitrogen cycle, extinction rates, fisheries depletion, etc.) to speak out and endorse specific actions. The most prominent example is NASA climatologist James Hansen, who was so freaked out by his own data that he began agitating several years ago for legislation to rein in carbon emissions. His combination of rigorous research and vigorous advocacy is becoming, if not quite mainstream, somewhat less exotic. A commentary in Nature last month implored scientists to risk tenure and get arrested, if necessary, to promote the political solutions their research tells them are required. Climate researchers Kevin Anderson and Alice Bows recently made an impassioned call on their colleagues to do a better job of communicating the urgency of their findings and to no longer cede the making of policy prescriptions entirely to economists and politicians.  

Lonnie Thompson, one of the world’s foremost experts on glaciers and ancient climates, framed the dilemma in a speech he gave to a group of behavioral scientists in 2010:

Climatologists, like other scientists, tend to be a stolid group. We are not given to theatrical rantings about falling skies. Most of us are far more comfortable in our laboratories or gathering data in the field than we are giving interviews to journalists or speaking before Congressional committees. Why then are climatologists speaking out about the dangers of global warming? The answer is that virtually all of us are now convinced that global warming poses a clear and present danger to civilization.

That’s the sound of serious-minded scientists fretting out loud to the rest of us that the earth is indeed f**ked, unless we get our s**t together. More and more are willing to risk professional opprobrium to drive that message home.


Thursday, December 6, 2012

NRA Pushing For Elimination Of Crime Spree-Solving Gun Registry


What happened to Michigan? They're killing healthcare and unions, and now allowing unregistered guns... oh wait, the GOP must have come into power.

Wednesday, December 5, 2012

Augmented Reality: Metaio Creator 2.0

*I wonder how much “simpler” that can get.

“Published on Oct 30, 2012 by metaioAR

“The Metaio Creator is a revolutionary product, allowing nearly anyone to take print media and attach websites, video, 3-D models, graphics or any other digital content using the latest image recognition, visual search and augmented reality technology by Metaio.

“The newest version of the Metaio Creator updates the user interface and experience to make it even more intuitive and familiar; creative professionals need only to drag, drop, point and click their way to their first AR experience — all in just minutes.

“Download the latest metaio Creator at www.metaio.com/products/Creator

*There’s gonna be more out of Metaio soon, because they’re sending me embargoed press releases. I hate those. Does anybody really think that the “press” can enforce a ten-day delay of information nowadays? Either it goes viral or it doesn’t go at all.

Sir Tim Berners-Lee: 'The UN Should Not Run The Internet'

Inventor of the world wide web Sir Tim Berners-Lee has warned that the UN should not be allowed to 'run the Internet'.

The British communications pioneer told the BBC that he was concerned about elements of a meeting of communications officials in Dubai where a new treaty is under discussion.

Berners-Lee said it would be "a disruptive threat to the stability of the system" if the UN was to extend its influence over the web.

Currently the Internet is controlled by a range of groups, many based in the United States.

They include Icann, which is a nonprofit group in California that maintains the web address system on behalf of the US government.

Some countries including Russia have argued that the UN's International Telecommunications Union should play a greater role in the management of the internet.

A clause put forward by Russia for a new telecommunications treaty at the World Conference on International Telecommunications says:

"Member states shall have equal rights to manage the internet, including in regard to the allotment, assignment and reclamation of internet numbering, naming, addressing and identification resources and to support for the operation and development of basic internet infrastructure."

That has been taken by some as a move to switch powers from Icann and other bodies to the ITU, or another UN-run agency.

But Sir Tim Berners-Lee, who is the director of the World Wide Web Consortium standards group, said that could endanger the fundamental principles of the web - and would probably just be less efficient.

"I think it's important that these existing structures continue to be used without any attempt to bypass them," he said, according to the BBC.

"These organisations have been around for a number of years and I think it would be a disruptive threat to the stability of the system for people to try to set up alternative organisations to do the standards."

He added that countries who want the ability to block and filter the Internet more easily in their countries should be resisted.

"A lot of concerns I've heard from people have been that, in fact, countries that want to be able to block the internet and give people within their country a 'secure' view of what's out there would use a treaty at the ITU as a mechanism to do that, and force other countries to fall into line with the blockages that they wanted to put in place," he said.

Monday, December 3, 2012

Impressive Voyager 1 Discovery Overshadowed By Mars Press Conference

Editorial: Back to the future with Low Power FM

Low Power Radio

Holiday gift idea for next year: your own hyperlocal radio station. It's a gift from the Federal Communications Commission (FCC). Non-profit community radio stations, which broadcast in a range from a few hundred yards to about three miles, can be a mode of self-expression, a valuable provider of town information and an antidote to radio homogenization up and down the broadcast dial.

The FCC has just established new rules governing the application process, authorization guidelines and operating conditions to encourage Low Power FM (LPFM) stations, and will be reviewing station proposals a year from now. Given radio's nearly undiminished clout in the face of media encroachment by all kinds of digital news and music platforms, LPFM represents a back-to-the-future power play on behalf of independent programmers.

A radical embrace of post-1995 media technology might not include listening to the radio. Radio is the oldest of old school. The first relevant electromagnetic wave experiments happened during Civil War times, and Guglielmo Marconi obtained his first radio-related patent in 1896. (He also later bought an earlier wireless patent from Thomas Edison. A presage of present-day patent maneuvers?) Marconi gave a successful transmission demo in 1895. He established a radio station the next year, and built a radio receiver factory the year after that. That's what you call a successful startup. The first commercial station in the US (KDKA) was fired up in 1920.

In my teenage years, radio was it for music discovery engines -- passive, non-interactive, highly curated, repetitive, in some cases corrupt. And it got worse as gigantic media holding companies created standardized station chains, in some cases eliminating all personalization and local programming. Corruption continued. Music discovery was rejuvenated on the internet by file-sharing, music subscriptions, YouTube and social streaming platforms.

Radio's listenership hasn't suffered from technology disruption as much as you might expect. Use of radio is much higher than general use of the internet, and nearly twice the size of internet radio. For all the exploration advantages of interactive technology, radio benefits from a commanding ease of use. Its most ancient attribute, that it exists in the air, makes it refreshingly untethered compared to TV, harmonizing with a growing cord-cutting movement. And radio's iron grip on car listening, where the driver is busy interacting with the steering wheel, is its greatest installed advantage. (Indicators are developing that Pandora is stealing listener share from the AM/FM audience, as the internet-native streaming service infiltrates the auto cockpit.)

Prometheus Radio Project

Music lovers have alternatives to unadventurous radio. College stations, for example, are uniformly devil-may-care programmers, even as they vary in professionalism. But cozy regionalism is rare, and that void is recognized as a problem by the governing body of America's airwaves, the FCC. The commission planted a stake in 2000 when it sanctioned community radio stations operating at very low power (100 watts or less). As you've probably noticed, community radio hasn't taken off substantially in the ensuing 12 years. Only a single LPFM station is broadcasting in the top 50 radio markets, which, according to the Prometheus Radio Project, represents 160 million residents not enticed by small-scale rogue programmers.

Part of the uptake problem has resulted from lobbying pressure applied by corporate FM operators, who don't want highly differentiated competition eating into their one-size-fits-all output. Radio turf protection is played out on the frequency dial, with a barrier to entry called the "adjacency rule." This rule establishes how close two FM operating frequencies can be to each other. The FM dial is divided into "clicks," according to the odd-numbered FM decimal system that runs up from 87.9 to 107.9. The signal of a station located at 91.5 might suffer from interference by another station broadcasting one click away at 91.7, depending on power and location. FM turf boundaries are regulated by the FCC.

The 2000 FCC action was hobbled by a three-click adjacency rule, meaning that the 91.5 station was buffered by a frequency territory up to 92.3 (four clicks away) within a prescribed region and power signature. Of course it is preposterous to suppose that a 100-watt antenna in somebody's back yard would compromise the signal integrity and profit margin of a 30,000-watt urban powerhouse, and the MITRE study of 2003 wholly debunked the absurd protectionism.

A series of congressional bills led to the Local Community Radio Act of 2010, and last week's unanimous endorsement of rules, by the FCC's five commissioners, finalized the LPFM startup criteria. Most provocatively, frequency fences have been dismantled with a one-click adjacency boundary (meaning new stations must be two clicks away from existing stations). The FCC has also expanded the allowance of translators, which effectively replicate a station's signal in an outlying region, to three translators in highly populated areas and 20 of them in rural territories.
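The click arithmetic above is easy to sketch in code. The helper names below, and the reduction of the spacing rules to a single click count, are my own simplification for illustration; the FCC's actual rules also weigh transmitter power and geographic distance.

```python
# Illustrative sketch of FM "click" spacing, not the FCC's actual rule text.
# US FM channels sit on odd decimals, 0.2 MHz apart: 87.9, 88.1, ... 107.9.

def clicks_apart(freq_a: float, freq_b: float) -> int:
    """Number of 0.2 MHz channel steps ('clicks') between two FM frequencies."""
    return round(abs(freq_a - freq_b) / 0.2)

def may_coexist(existing: float, proposed: float, min_clicks: int) -> bool:
    """True if the proposed station sits at least `min_clicks` away."""
    return clicks_apart(existing, proposed) >= min_clicks

# Old three-click buffer: an LPFM near a 91.5 incumbent had to sit at least
# four clicks away (92.3 or beyond on the high side).
assert not may_coexist(91.5, 92.1, min_clicks=4)  # three clicks: blocked
assert may_coexist(91.5, 92.3, min_clicks=4)      # four clicks: allowed

# New one-click buffer: two clicks of separation now suffice.
assert may_coexist(91.5, 91.9, min_clicks=2)      # two clicks: allowed
assert not may_coexist(91.5, 91.7, min_clicks=2)  # one click: blocked
```

Framed this way, the change is easy to see: the fence around each incumbent shrank from four channel steps to two, which is what opens urban dial space for new LPFM entrants.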

It's not easy to start a radio station. Commissioner Mignon Clyburn notes in her agreement opinion (PDF link) that 25 percent of existing authorizations have not been built. She asserts the FCC's intent to reduce speculative applications, but this is where idealism might collide with reality. The FCC commissioners declaim euphorically about the diversity potential, and the many aspects of local value, that a swarm of hyperlocal community broadcasters would bring to the airwaves. Clyburn cites a New Orleans LPFM station that remained on the air during Katrina when the big outlets went silent.

LPFM Radio Kit

The FCC wants serious applicants who have the technical and financial resources to build a permanent transmission facility, and intend to broadcast at least eight hours a day. That is one serious hobby. Certainly, past applicants who have been frustrated and denied by adjacency blackouts, and who can now stake a claim, may grab the opportunity. But the vision of a new world proliferating in a re-energized FM band might develop more slowly than hoped for by the advocates who created these progressive new guidelines.

What about demand? Does hyperlocal work in any medium? Town newspapers have been skewered by the web. Website alternatives to those newspapers remain unproven after years of attempts. My town has a public TV station, and wow, do I never watch it.

Radio is different though, at least for me, and I'm probably speaking for a potential audience of some undetermined size. There is a romance to radio for me, and there is the over-the-air ubiquity of it. And, of course, the car. Local programming could have an immediacy that might pull me away from yet another installment of NPR's Wait, Wait, Don't Tell Me on a Saturday morning -- especially since most existing radio franchises are available online.

As with any other medium, audience and traffic depend on the content. Programming freedom and individuation are the keystone values the FCC has protected in this action. For that the agency should be congratulated. And here's a holiday wish for its commissioners: May all their most rhapsodic visions for a frothing grass-roots broadcasting culture come true.


Brad Hill is a former Vice President at AOL, and the former Director and General Manager of Weblogs, Inc. He has a lifelong love of radio, and has worked as a broadcast DJ.