Circuit Writer Version 5.9
by Jim Scheef
Network Neutrality is Still an Issue
It is truly a sorry state of affairs that network neutrality
is still an issue. Back in 2005, the FCC issued net neutrality
rules that allowed carriers exceptions for “reasonable
network management.” In a recent Senate hearing, FCC Chairman
Kevin Martin gave some guidance on what the agency considers
appropriate broadband network management practices. He also told
the assembled Senators that the FCC can enforce net neutrality
without the help of Congress. Even while the FCC investigation
of Comcast continues, he said that throttling specific applications
like BitTorrent is not likely to be among acceptable management
practices. You can read the eWeek article at tinyurl.com/6dxchx.
When Congress or any part of the government is involved, the
devil is always in the details. These details begin with the
very definition of “Network Neutrality” – who and what
is neutral and to whom? Is this about charging certain
users extra, throttling BitTorrent, or applying a surcharge for
selected content like video? And, of course, the biggie: who decides?
No matter what your opinion on the looming election (is it over
yet?), our Congress-people need to be reminded that net neutrality
is an issue we care about.
Security thru Obscurity
Before I jump into this, allow me to point out that this article
comes from something called “Dark Reading”, a website
under the TechWeb (techweb.com) umbrella which bills itself as “the
industry's most comprehensive security site for IT pros.” The
articles include all sides of computer and network security.
There are even some videos for a change of pace. One article in
particular caught my eye. “Proprietary Security Through Obscurity” (tinyurl.com/69y2zc)
is a short article about pacemakers – yes, those things
that keep some people’s hearts beating properly. The story began
with a New York Times report about a paper, published on an obscure
website, describing how a heart device was found to be vulnerable
to hacker attacks (tinyurl.com/6h2mcl). It seems that a team of researchers
was able to reverse engineer the wireless interface for a combination
pacemaker and defibrillator. Using this knowledge they were able
to change the programming in the device to adjust its “pace”,
shut it down, read patient data, and even deliver a potentially
lethal jolt.
Certainly patients (including Dick Cheney) with these devices
are under no imminent threat; however, newer devices are being
developed that can connect to the Internet to allow doctors to
monitor a patient remotely. This could mean better, more frequent
monitoring with fewer trips to the doctor’s office – a
win-win, except for what should be obvious. Of course, we (you
and I) understand the need for security on the Internet. In
the type of statement we hear all too often, the manufacturer
of the “hacked” device said, “To our knowledge
there has not been a single reported incident of such an event
in more than 30 years of device telemetry use, which includes
millions of implants worldwide.” Obviously if they don’t
know about it, it doesn’t exist.
One manufacturer said that it used “proprietary techniques” to
protect the security of its implants – and here is where
we get to what I call the “head in the sand” approach
to security, or security thru obscurity. Back in the day, Ma
Bell used obscurity to secure its long distance trunk lines from
unauthorized use. John Draper, aka Captain Crunch, demonstrated
that this is not a good approach with the 2600 Hertz tone produced
by a whistle packed in breakfast cereal. His discovery led to
a cottage industry making “blue boxes”, “red
boxes” and devices of other colors to make free long distance
calls. According to legend, Apple Computer may have been funded
from such sales.
So which is better, proprietary security or security based on
published protocols? The Open Source community argues that only
when the source code is published for peer review can we be reasonably
assured that any protocol works as advertised. The Internet protocols
are the best example. Openness is especially required for encryption.
Proprietary or “secret” encryption algorithms cannot
be verified by the community of experts. How do you know that
a proprietary encryption program doesn’t have a hidden
key or back door that lets the author or manufacturer – or
some other organization or government – read the encrypted
data? For an in-depth look at this issue, I recommend the book “crypto” by
Steven Levy (2001, Viking Penguin). See my review in January
2002, DACS.doc (dacs.org/archive/0201/feature3.htm). Read the
articles and see what you think.
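The open-algorithm argument is easy to demonstrate in practice. Because an algorithm like SHA-256 is a published standard, its official test vectors are public, and anyone can check that an implementation behaves exactly as specified – something that is impossible with a secret, proprietary algorithm. A quick Python sketch (illustrative only, using the standard library):

```python
import hashlib

# SHA-256 is a published NIST standard (FIPS 180-4), so its official
# test vectors are public knowledge. The vector below is NIST's
# well-known example: the SHA-256 digest of the message "abc".
NIST_VECTOR_INPUT = b"abc"
NIST_VECTOR_DIGEST = (
    "ba7816bf8f01cfea414140de5dae2223"
    "b00361a396177a9cb410ff61f20015ad"
)

# Anyone, anywhere, can run this check against any implementation.
digest = hashlib.sha256(NIST_VECTOR_INPUT).hexdigest()
assert digest == NIST_VECTOR_DIGEST, "implementation disagrees with the spec"
print("SHA-256 implementation matches the published test vector")
```

A vendor’s “proprietary technique” offers no equivalent check: with no published specification and no public test vectors, customers simply have to take the vendor’s word for it.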
My columns are available at http://circuitwriter.spaces.live.com/,
where there are more links and comments are welcomed. There is
even an RSS feed for those who cannot wait.