Don’t buy an Apple TV 2

March 6th, 2011

So I had a $100 gift card, and applied my usual logic: buy something random to play with. In this case, since the Apple TV 2 is $99, it seemed like a worthwhile experiment. I say experiment, because I fully expected to be ripping it apart and poking at its innards anyway, so whether or not it actually worked well wasn’t really an issue for me. But let me share some impressions, anyway.

The hardware is typically Apple: very well done. It’s a cheap but rugged ARM-based system with 8GB of flash, 256MB of RAM, and lots of other goodies. There’s enough hardware in there to make a reasonable Linux set-top box, or, more likely, to break without my feeling much was lost (see software below).

Unfortunately, the software is typically Apple, too. Not only is it so locked down as to be useless out of the box (though you can fix some of that by jailbreaking, adding XBMC, etc.), but it doesn’t even live up to Apple’s traditional standards – especially considering this is their second attempt at getting it right. The latest attempt fails because there is no integration. Rather than a Boxee-style home screen surfacing the latest media you might be interested in, the facebook generation is greeted with a 1980s-style menu system that might have been designed on the back of a napkin (were it wide enough), in about ten minutes. Here are some of the ways the software fails to be useful:

  • The remote is horrific to use. No scroll wheel (iPod style), just a handful of not-very-useful buttons. I’m sure it’s slightly better if you own an iPhone, but the out-of-the-box experience is a shiny remote control (lovely hardware) with a bad software experience.
  • The home screen presents nothing but simplistic, non-customizable drop-down menus (until you install “Overflow”, not supplied by Apple). Very 1980s, not even very 1990s, and certainly not the 2011 “facebook” generation, wherein home screens are supposed to pull in the latest media, recommendations, etc.
  • The integration with services like Netflix is an afterthought. Interested in browsing through your instant queue? Every time you go into a title and leave, it takes you back to the start of – if you’re like me – several hundred items that have to be clicked through to get back to where you were. Not very Apple.
  • The Podcast subscription service is utterly painful: rather than showing you the latest podcasts, you have to individually navigate to items in your “favorites”. Again, it’s an afterthought, added so the box seems useful, but not something you’d want to use every day (or at all).
  • Media sharing with your existing Apple system kinda works, but doesn’t include Audiobooks (a jab at Amazon?), and is very clunky.
  • Radio doesn’t do favorites, etc. Again, looks like an afterthought.

Really, the only thing this is good for out of the box is as a means to give Apple more money to watch iTunes content. So if you’re planning to rent movies, maybe it’s useful as a secondary set-top box for watching iTunes. But it is not, and will not be, something you want to use if you live in the modern world.


Trying to understand US education

February 13th, 2011

I’m overly critical sometimes, and it’s easy for me to think I have all the answers. So this is an information gathering blog post while I wait for something to finish on my computer. In short, I’m trying to figure out the difference in approach between US Middle and High School vs. my experience with British secondary education.

When I went to secondary school (age 11 – think combination of middle and high school), we had fixed class schedules for the first 3 years. In my case, it was required to study the following individual subjects from age 11 to age 14 (mostly determined by the National Curriculum):

  • Art
  • Biology
  • Chemistry
  • Civic Studies
  • Design Technology
  • English
  • French (later German)
  • Games (outdoor PE)
  • Geography
  • History
  • IT
  • Math(s)
  • Music
  • Personal Social Education (PSE – sex education, etc.)
  • Physical Education (PE)
  • Physics
  • Religious Studies

Now, I was at a private school, and some of these topics differ if you’re not, but not many (Latin was dropped the year I started…sadly). Most of these are mandated at various “Key Stages” of the UK National Curriculum as required, even if only for a year or so. In some schools, for example, they combine sciences, but you still have to study science from age 11 onwards.

Classes were divided into 40 minute “periods”, with a bell in between, and 5 minutes to get to the next class. There were no “hall monitors”; you just asked if you wanted to go to the bathroom, and they trusted you. Also, we all wore uniforms (complete with blazer and Harry Potter style ties with different School Houses), and (private school bit) were required to stand when a teacher or adult entered the room, as a sign of respect. There were no metal detectors, and the most violent thing I recall ever happening was someone stealing some potassium from the Chemistry lab.

US secondary education is highly regional in nature, with very few national standards (No Child Left Behind, etc. don’t actually attempt to set a national curriculum), so what you learn in one state varies wildly from another – even down to how the Civil War is described in history class (there being no agreement on that). It is my understanding that high school here is a lot more like what would be called “college” in the UK (not a university, but an alternative system available to 16-year-olds), and middle school is a half-way point. As I understand it, it’s not required to study science, history, or geography beyond a very elementary level. Classes seem to be longer in duration but focused on fewer topics of study, with a lot more choice.

I won’t pretend otherwise: I disfavor the notion of allowing children to opt out of classes they don’t like. For example, I suck at German…seriously. I just can’t handle the grammatical genders: I would get everything else right, but be unable to pick the correct one of the three possibilities. Yet I’m glad that I was required to study German. I’m also glad that I didn’t have a choice about studying a foreign language, or art, or other topics I might have avoided given the choice (I consistently scored over 100% in Religious Studies thanks to a bonus-points system, but I might still have opted out – I even once got everyone out of an 80 minute test by keeping the teacher side-tracked in a discussion/debate on cryopreservation as applied to the second coming of Christ). After the age of 14 it was possible to drop certain subjects, but not all: German got dropped :) but a foreign language was still required, as was art.

Anyway. If you have links/stories about how secondary education actually works in the US, I would be genuinely interested. With enough real data, I can hopefully convince myself that not all schools here are maximum-security facilities with metal detectors, cops, and the other things my notions suggest.


Pipe dream: USB support for empeg

February 3rd, 2011

So I was daydreaming about USB support for the empeg again. I think I now (in my old age) understand enough to actually implement this (before, it was a youthful pipe dream), but it won’t be simple and I don’t have any time for it soon – heck, it’s been ten years or more since people first asked about this; what’s another year or two? Anyway, here is the basic concept, since it also applies to other systems without USB support.

The empeg has one IDE (ok then, ATA) channel and a controller that supports two devices, though typically only one disk is actually used (who needs more than one disk these days, with huge flash sizes?). It ought to be possible (in theory) to use a SATA disk with the aid of an IDE->SATA adapter (these do exist) and a power rail, which would prolong the platform’s life by a few years. More to the point, it is technically possible to expose an IDE-like device that essentially passes the 16-bit data bus through to general-purpose GPIO lines (after enough logic to fool the controller into thinking it’s a “disk”, plus some hefty driver modification). With that done, and using PIO to move data to/from an attached USB 1.1 chipset, it would be possible to provide minimal support for USB at (perhaps) a few MB/second. Other peripherals could similarly be added, if a more complex protocol were built and driver support added.
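The pass-through idea can be modelled in a few lines of toy code: treat each 16-bit PIO write to the fake disk’s data register as directly setting 16 GPIO lines, and each read as sampling them back. This is purely a software sketch of the concept (the `GpioBus` class and register names are invented for illustration – nothing here touches real ATA hardware):

```python
class GpioBus:
    """Toy model: a fake ATA data register whose 16 bits drive 16 GPIO lines."""

    def __init__(self):
        self.lines = [0] * 16  # GPIO line states, bit 0 first

    def write_data_register(self, word):
        """A 16-bit PIO write to the 'disk' data register sets the lines directly."""
        if not 0 <= word <= 0xFFFF:
            raise ValueError("ATA data register is 16 bits wide")
        self.lines = [(word >> bit) & 1 for bit in range(16)]

    def read_data_register(self):
        """A PIO read packs the current line states back into a 16-bit word."""
        word = 0
        for bit, state in enumerate(self.lines):
            word |= state << bit
        return word


bus = GpioBus()
bus.write_data_register(0x00A5)   # light LEDs on lines 0, 2, 5, 7
print(bus.lines[:8])              # → [1, 0, 1, 0, 0, 1, 0, 1]
```

Sequencing register reads and writes like this against a USB 1.1 chipset’s registers, rather than LEDs, is the whole trick; the bandwidth ceiling comes from doing every transfer one PIO word at a time.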

It has been my intention for some time to learn enough of the ATA spec to implement a very basic disk-like device, using an old PC motherboard with a common IDE chip/driver as a test of the concept. I now think this might actually be doable in my lifetime, so a pet project this year will be to hack up something along those lines that gives me GPIO lines from an “IDE” “disk”, sufficient to light up some LEDs, etc. If I get that working, I will solicit input on the more audacious goal of driving a cheap USB chipset from it.


BoA SafePass Folly

January 31st, 2011

So BankOfAmerica got on the bandwagon of using cellphones to authenticate via text message a few years ago (did I mention that I came up with this idea long before it was commercially available, but was beaten to the patent? True story – I even had a meeting with an investor to discuss the idea at the time). You go to some silly webpage, click a button, and they text a code to your cellular phone, which you then enter – in the vein of “something you have, something you know” (your phone, your login). Sadly, the BoA implementation is full of all kinds of wonderful FAIL. Let me explain. Because venting helps.

To save money (or whatever), BoA grab your carrier information at signup and replace the phone number with the email-to-SMS gateway address of the carrier in question – sign up using an AT&T phone and they will send via AT&T’s gateway (which is broken anyway, as it uses only 10 digits rather than the full globally unique number with country code), and keep sending via AT&T no matter how long you keep that number, or whether you move carrier. So you might think that moving to T-Mobile gets you away from AT&T, but not as far as BoA’s systems are concerned. Now, some folks at BoA did ponder this problem (however briefly), and set up an automated process based around texting “HELP” to 73981, which allegedly causes it to wake up and smell the coffee (technical term). The problem is, as many attest online, this is error prone and often does nothing.
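The failure mode described above is easy to model: the carrier-to-gateway lookup happens exactly once, at enrollment, and the result is stored forever. A toy sketch (the carrier names and `.example` gateway domains here are placeholders, not BoA’s or any carrier’s actual data):

```python
# Toy model of an email-to-SMS setup that snapshots the carrier at enrollment.
GATEWAYS = {
    "OldCarrier": "sms.oldcarrier.example",
    "NewCarrier": "sms.newcarrier.example",
}


class SafePassRecord:
    def __init__(self, number, carrier_at_signup):
        self.number = number
        # The bug being described: the gateway is resolved ONCE and stored,
        # so porting the number to a new carrier never updates this field.
        self.gateway = GATEWAYS[carrier_at_signup]

    def sms_address(self):
        return f"{self.number}@{self.gateway}"


record = SafePassRecord("6175551234", "OldCarrier")
# The customer later ports the number to NewCarrier; the stored record
# never notices, so codes keep going to the old carrier's gateway forever:
print(record.sms_address())  # → 6175551234@sms.oldcarrier.example
```

The fix is equally obvious in model form: re-resolve the carrier at send time (or let the customer update the stored gateway), which is exactly what the “HELP to 73981” process is supposed to do and apparently doesn’t.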

So I call BoA online “technical support”, say nice things to the first person I speak to just to get rid of them, and ask for whatever “manager”, “supervisor” or other entity can actually help with the problem. I know what the problem is. And of course I’m told “can you send a text to…” – at which point I explain that I know the number, I know the trick, I’ve done it 4 times over the past 3 days before even bothering to call, that I have no confidence in it working, and that I want a technical support ticket opened with the higher-ups. Eventually I get this. Guess what the ticket asks for? Yup. The email-to-SMS gateway for my new number. In other words, they want me to tell them how to send a text to my phone via my new carrier’s gateway, and they:

  • Can’t figure this out for themselves (the reset exposed publicly doesn’t work)
  • Won’t provide a convenient means to do this online (enter gateway or whatever)

They also asked me several times for the make and model of my phone. As if that makes a difference. Apparently it can for the iPhone (no idea why), but I have a sane phone, a Google Nexus S, and I can guarantee this has nothing to do with the phone. I could put my T-Mobile SIM in the cheapest, nastiest, crappiest phone around and have exactly the same problem.

Overall, I’m not very confident in the Safepass system. But, hopefully, at some point today, I can finally make a transfer between one account and another without having to engage in more folly.


PSTN call routing

January 31st, 2011

UPDATE: The bit I was missing was NPAC. All answered now.

So tonight, we ported over Katherine’s phone number from one provider to another. This got me thinking about number porting in general, and specifically about the call routing between carriers within the NANPA (North American Numbering Plan Administration) system – NANPA being the people responsible for assigning and managing US, Canadian, and other (now non-Mexico) North American numbers, much as ARIN does for Internet address space. I need to know.

In the case of the Internet, ARIN assigns top-level netblocks, ASes, and other routable entities, and then various top-level network routers and ISP equipment announce routes originating from their 16- or 32-bit AS numbers. The global routing table is large but can still (mostly) fit in memory on big, beefy routers, because not everybody has their own AS (network). Instead, most people have addresses assigned within a provider’s non-portable netblocks. With telephone carriers, this isn’t true under modern NANPA rules: you can freely move your local phone number between carriers as you wish. The fact that you have a “617” (Boston/Cambridge/etc.) or “415” (best coast regional) number means very little in practice, as you might have moved a million times and changed carrier too. Your number is therefore neither really regional, nor a non-portable carrier-assigned number.

Reasoning tells me that carriers can’t just announce routes for particular “blocks” of phone numbers any more, because those blocks rapidly fragment as numbers hop between carriers. Nor does it seem practical to advertise an announcement for each number individually – and yet that seems to be the only way to truly do this right. One possibility is that per-number routing is done, but at the local exchange level (the 759 in 617-759-XXXX), so that even if I move regionally, there is still an entry sitting in that exchange, like in the good old days. But is it still like that? How does routing between large telecommunications companies work in reality? I need to know. Preferably, I “need” an extremely large book that details this and the protocols involved in a ridiculous level of detail. Thanks!
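The NPAC answer mentioned in the update works roughly like a per-number override database consulted before falling back to prefix-block routing. A hedged sketch of that two-tier lookup (the carrier names and number data here are invented for illustration):

```python
# Toy model of post-porting call routing: consult a per-number override
# table (NPAC-style) first, then fall back to the NPA-NXX prefix block.
PREFIX_BLOCKS = {"617759": "Carrier A"}        # original block assignments
PORTED_NUMBERS = {"6177590042": "Carrier B"}   # numbers ported away from the block


def route(number):
    # Per-number override wins: this number has been ported out.
    if number in PORTED_NUMBERS:
        return PORTED_NUMBERS[number]
    # Otherwise the original block assignment still applies.
    return PREFIX_BLOCKS[number[:6]]


print(route("6177590042"))  # → Carrier B (ported)
print(route("6177590001"))  # → Carrier A (never ported)
```

In the real system the override maps a ported number to a Location Routing Number (LRN) identifying the new carrier’s switch, rather than to a carrier name directly, but the shape of the lookup – exact match first, block fallback second – is the same idea, and it keeps the override table proportional to the number of ported numbers rather than the number of all numbers.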


NOTE: This falls under Obsessive Compulsive Need To Know. The kind of reasoning that has me signed up to real-time alerts from my regional ISO whenever power generation within MA falls below certain levels. No normal person would care about this level of detail in their life, and I know this ;)

Telepacific abuse

January 28th, 2011

So a couple of days ago, someone at a Telepacific host managed to illegally obtain SIP credentials for a VoIP account I have with one of my (awesome) phone providers. Using those credentials (directly, not by compromising my Asterisk server), they illegally registered as me using a softphone client and began making international phone calls to exotic destinations. I’m grateful that various safeguards worked and they were cut off after spending only $20 – enough to be annoying, but at least not hundreds or thousands of dollars. So then comes the aggressive action against the abusing moron.

We traced the IP, the illegal registration(s), and the phone calls. The address is owned by “Telepacific”, who appear to have changed their name from, or acquired, Mpower at some point (judging by the abuse contact below). The whois record for the address in question contains these choice tidbits:

OrgAbuseHandle: MIAA-ARIN
OrgAbuseName:   Mpower IP Abuse Administrator
OrgAbusePhone:  +1-877-642-4375

That contact information is out of date to begin with, but worse, the phone number listed is itself a SCAM service: call it and you are invited to press * to sign up for some ongoing text-message commitment. The abuse number is itself an “abuse number”! This is disgusting, wrong, and highly infuriating. I called ARIN and a bunch of other organizations to have action taken against Telepacific over this, and when I finally got through to Telepacific I was fobbed off with some kind of email contact. I’m not optimistic that they’ll fix this, hence this handy blog posting.


On Linux Platforms

January 23rd, 2011

One of the major differences between Linux distributions and other operating systems (both Free and non-Free) is that Linux distros often try to give you everything from one source. Want a piece of third-party software? You’re expected to get it (and its dependencies) into the distribution, and install those versions. Other operating systems provide a base platform onto which third-party tools, libraries, and applications are installed in a separate location. This is close to the original intention of /opt – except that there it’s actually used, rather than shunned as if wanting to do this were some kind of bad idea – and it allows one version of the basic OS to live for a number of years independently of any or all of the applications installed on it.

Unlike many distro folks and Linux enthusiasts, I actually prefer the idea of providing a basic, stable, unchanging platform upon which self-contained applications can be installed. Kinda like “Enterprise” Linux, but different – Enterprise Linux distributions basically snapshot a particular set of distro software and treat that as a “platform”, while their upstream sources don’t. In my perfect utopia, there’s a huge, bright line between basic OS components and everything else. I want a stable OS, but I might want to install a more recent web browser, or some engineering design tool newer than what my OS ships, and I want to be able to do that trivially and independently of the OS. I don’t want it installed in /usr/bin – my OS-supplied core junk goes there, but my applications live separately. Some experimental distros have even tried this stroke of sanity by cloning OS X’s /Applications behavior, but only experimentally.

In my perfect world, I would get “Fedora” from the Fedora Project, I would install it, and I would get a basic environment including a desktop. It might even include a web browser, but it would not include all of the other stuff. Instead, that would be installed into completely separate directory structures, fully self-contained, away from the basic OS environment. Some of it might come with the distro, and some of it might even be packaged and distributed using distro tools, but it would be trivial to upgrade any application independently of the base OS platform, because it would be stored separately from core system components. Try installing a different version of Firefox, or any other system-supplied app, on your favorite Linux distribution today without placing it in a separate directory by hand, abandoning actual packaging, or butchering the distro config.

One day, what I want is going to happen. There will be a realization in the wider Linux community that consumers want a basic platform, and that they want to treat the other, non-core junk independently of it. But this realization (in the Linux space) is still several years away, and it will come only after more people realize the benefit of having a computer that just works, without the need for hacking or updating or messing around with OS pieces to get there.