Odds and Ends

I swear I had three small things to talk about, but I can only think of two. Oh well.

The first is the new topology map in OpenNMS. As someone who really, really hates network maps, I love the direction the team is taking with them in the application. We have a geographical map which is just plain awesome, and now the topology map is starting to come together.

The topology map’s job is to show you how devices are related, and the beauty of it is that there is an API so you can determine exactly the relationships you want to see. For example, you could show Layer 2 connections, or, in a VMware environment, you could display how host and guest operating systems are related to each other and to network storage. In the future we could have relationships between devices and applications. The possibilities are limitless.
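The idea can be illustrated with a toy sketch. To be clear, none of the class or function names below come from the actual OpenNMS API; this is just a hypothetical model of how pluggable "relationship providers" could each contribute their own view of how devices relate:

```python
# Hypothetical sketch of a pluggable topology-provider model.
# All names are invented for illustration; they are not the OpenNMS API.

class TopologyProvider:
    """A source of relationships between devices."""
    def edges(self):
        raise NotImplementedError

class Layer2Provider(TopologyProvider):
    """Relates devices by discovered Layer 2 links."""
    def edges(self):
        return [("switch-1", "server-a"), ("switch-1", "server-b")]

class VMwareProvider(TopologyProvider):
    """Relates guests to their host and to shared storage."""
    def edges(self):
        return [("esx-host-1", "guest-web"), ("esx-host-1", "guest-db"),
                ("esx-host-1", "san-1")]

def topology(providers):
    """Merge the selected views into one adjacency map."""
    graph = {}
    for provider in providers:
        for a, b in provider.edges():
            graph.setdefault(a, set()).add(b)
    return graph

# Pick only the relationships you care about:
print(topology([Layer2Provider()]))
print(topology([Layer2Provider(), VMwareProvider()]))
```

The point of the sketch is that the map itself stays generic; only the providers know what a "relationship" means, which is why new views (devices to applications, say) could be added later.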

Even Papa Johns Pizza has put it on the big screen.

The second thing, which is probably obvious but I still want to complain about it, is that iOS 7 sucks.

You might be asking yourself: Why do you care? True, Android is my mobile platform of choice, but my current phone is locked to the AT&T network. I tend to fall on the opposite side of the “unlocked phone” debate within the open source community in that I believe if you accept a discount on a device in exchange for being tied to a particular network for, say, two years, then you shouldn’t break that contract. So, when I go overseas to Sweden, I take an iPhone 3GS that is unlocked.

Now that my spouse has moved from iPhone to Android, her iPhone 4 was up for grabs, so I decided to get it unlocked.

The process was pretty simple, but Apple decided to force me to upgrade to iOS 7 in order to do it. So when Cult of Mac boasts that 71% of the phones that can run iOS 7 do, they don’t take into account those of us who were dragged kicking and screaming into it.

And you can’t go back (Apple seems to have an odd definition of “backup” and “restore” in iTunes).

I hate almost everything about it. I hate the thin sans-serif font. I hate the Windows Metro icons. I hate the needless animations.

And I can’t find anything. It took me forever to figure out how to unlock the screen rotation. It used to be simple: double-click the home button and swipe right. Now it’s buried on some settings page.

Anyway, since the biggest thing anyone is saying about the new iPhone is “ooh, it comes in gold,” I think Apple is in its twilight years.

While I didn’t always agree with him, I miss Steve Jobs. Not as much as I miss Lou Reed, but still.

In The "Why Didn’t I Think of this Before?" Dept.

As I was drifting off to sleep last night, a thought came to me:

Why don’t we have E911 apps for our smartphones?

I started my professional career at Northern Telecom (later called NORTEL) and in one of my roles I worked on Enhanced 911 software.

For those readers outside of the US, 911 is the emergency services phone number. It is supposed to be a single number people can call in order to reach fire, police or medical assistance. It’s similar to 999, 112, or 000 in other countries.

Back then everything was focused on land lines, and so the workflow was pretty simple. The phone switch would deliver the caller’s phone number (the ANI, or Automatic Number Identification) to a Public Safety Answering Point (PSAP). There, an operator would take the call, and if everything went correctly, information on the caller would be automatically loaded into the system. The operator would be able to see the caller’s address, and “soft buttons” would be loaded with numbers for the closest police department, fire station or medical services.

For example, if the caller requested help with a fire, the operator would just push the “fire” soft button and the call would be transferred to the fire department closest to the caller.
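That legacy workflow can be sketched as a toy lookup. The subscriber data, phone numbers, and field names here are all invented for illustration; real PSAPs query ALI (Automatic Location Identification) databases rather than an in-memory table:

```python
# Toy sketch of the legacy E911 workflow described above: the switch
# delivers the caller's number (ANI), and the PSAP position looks up the
# subscriber's address and pre-loads "soft buttons" for the nearest
# responders. All data here is invented.

SUBSCRIBERS = {
    "9195551234": {
        "address": "123 Main St, Pittsboro, NC",
        "soft_buttons": {
            "police": "9195550001",
            "fire": "9195550002",
            "medical": "9195550003",
        },
    },
}

def answer_call(ani):
    """Return what the operator's screen would show for this caller."""
    record = SUBSCRIBERS.get(ani)
    if record is None:
        return {"address": "UNKNOWN", "soft_buttons": {}}
    return record

def transfer(ani, service):
    """Operator pushes a soft button; return the number the call routes to."""
    return answer_call(ani)["soft_buttons"].get(service)

print(transfer("9195551234", "fire"))
```

The whole design hinges on the ANI mapping to a fixed address, which is exactly why it breaks down for mobile callers.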

This is a bit of an oversimplification, but the main point is that it was all tied to the caller’s location, which was static. I left before they started to seriously attack the issue of 911 calls from mobile devices, when the focus was shifting from the caller’s phone number to the location of the equipment (e.g. the tower) handling the call, but the system has changed very little since the 911 service was introduced.

As my synapses randomly fired while I was trying to sleep, I wondered what would happen if I had to call emergency services from an unfamiliar location.

Back in August, my wife and I were almost in a traffic accident. We were driving on the highway while it was lightly raining, and a white Ford Ranger that was trying to merge lost control. It crossed in front of an older model Escalade, clipping it in the left front fender, which caused the truck to spin 180 degrees, cross four lanes of traffic and slam into the concrete lane divider. It missed our car by feet, and only my spouse’s amazing driving skills kept our car from being hit (as the passenger I went all NASCAR pit crew with “go low! go low!” but I am not sure it helped much).

Out of the 20+ cars that must have witnessed the accident, ours was the only one to stop (well, besides the truck and the Escalade). I called 911 and was able to report our exact location (“heading eastbound on I-40 at the Hwy 55 exit – exit 278”) since I used to work near there, but what if I wasn’t familiar with the area? What if I was in a different country?

Considering the proliferation of smartphones (or as I prefer to call them, handys), wouldn’t it be much simpler if I could just launch a 911 app that would connect to a server on the Internet (calling the Internet the “cloud” will get you smacked) and report my location? It could even report my native language (based on the locale of the phone) so the operator would have that information, and it could also stream video and audio in order to give the PSAP accurate information to judge the proper response. For example, it would be silly to send fire trucks and multiple cars to a fender bender on a side street, but that might be warranted for a crash on the interstate with possible injuries and the need for traffic management.
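A back-of-the-napkin sketch of the initial report such an app might send, before the voice call is even answered, could look like this. The field names and payload shape are entirely made up for illustration; any real next-generation 911 system would have its own standards:

```python
# Hypothetical sketch of a 911 app's initial incident report.
# Field names and structure are invented; this is not any real protocol.
import json

def build_incident_report(lat, lon, accuracy_m, phone_locale="en_US",
                          can_stream_video=True):
    """Assemble the data a PSAP would want up front: where the caller is,
    what language they speak, and what media the device can stream."""
    return {
        "location": {"lat": lat, "lon": lon, "accuracy_m": accuracy_m},
        "language": phone_locale,  # taken from the phone's locale setting
        "media": {"video": can_stream_video, "audio": True},
    }

# Illustrative coordinates only.
report = build_incident_report(35.77, -78.90, accuracy_m=8)
print(json.dumps(report, indent=2))
```

The operator gets the location and language without the caller having to say a word, and the media flags tell the PSAP whether it can pull a video stream to judge the scale of the response.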

I am not vain enough to think I came up with this before anyone else, and a quick search shows that there are a number of companies working on this, but it doesn’t seem to have been adopted much. RedSky appears to have offered one as far back as 2010, but it lacked a video component, and another website I checked appears to have been hacked and defaced without anyone noticing, so I don’t think this is a high priority for anyone at the moment.

This seems to be a great project for an Open Government initiative. Using open standards, we could probably easily build the app and then permissively license the server side libraries so that they could be embedded in current PSAP offerings.

Seems like a cheap win, but knowing how slowly the 911 infrastructure changes (e.g. think glacier) I don’t see it happening soon unless people start asking for it.

Goodbye Cyanogenmod, I'll miss you

It is with some disappointment that I read of Cyanogenmod’s descent into fauxpensource. Not only does it appear that they are doing everything they can to ruin any credibility with their community, it also means that I need to find a new operating system for my Android devices.

For those who don’t know, Cyanogenmod was a very popular implementation of the Android Open Source Project (AOSP). Basically, it is a recompiled version of the software Google and others distribute with their phones, but the aim of AOSP is to be as open source as possible (i.e. without a lot of proprietary add-ons). If you were to buy, say, a Google Nexus 4 and a Samsung S4, both Android phones, you would find that the user interfaces on the two are radically different.

The reason is that it is rare for a company to want to sell commodity products. If the software on Android devices were the same across all of them, price would become the main differentiator. If you are a device maker aiming for the margins Apple is able to demand for its products, then you want to add something unique that isn’t available elsewhere, and it is hard to do that under the open source model. Also, the traditional way to offset costs is through deals to bundle other products into your offering. Does anyone here remember buying a retail computer with Windows installed on it? Usually the desktop would arrive full of pre-installed software, or “crapware”, that vendors paid to have shipped with the product. This happened when I bought my Galaxy S3. I tried to remove all of the cruft, such as the Yellowpages app, only to have the operating system tell me that it was a “critical” system app and couldn’t be removed.

So, within two hours of getting my phone I had root access and installed Cyanogenmod.

Now, I have struggled for over a decade to balance the desire to create free and open source software with the need to make money. I can understand the pressures that the Cyanogenmod team must have felt watching their buddies at commercial software companies making large salaries with a decent amount of job security while they toiled along with no real business model. I, too, have heard the siren song of Venture Capitalists who believe that all you need to make a lot of money is to offer some sort of “enterprise” or commercial version of your open source project.

Most of them are wrong.

I was in a meeting with a VC a few weeks ago when this came up. Now you have to realize that there has only been one “Valley-grade” success story with open source (well, that still exists as a private company), and that is Red Hat. However, most in the Valley don’t view it as a success, and I think that is mainly because it wasn’t a Valley deal. The first thing the VCs will say is that Red Hat is too small – it’s not a “real” success – when the fact is that they have a market capitalization similar to Juniper Networks (about US$10 billion). The second thing is that they’ll point out that Red Hat has “an enterprise version”. This is also not true. Red Hat sells time, just like we at OpenNMS do, through support and ease of use. If I want to, I can buy that access, take the product, remove all of the trademarked information and create an open source, feature for feature copy. This is exactly what CentOS does and why I call the measure of whether or not a company’s products are truly open source the “CentOS Test”. The main reason that the Valley has been unable to duplicate Red Hat’s success is that they always undermine it with some sort of commercial software component that removes the reason people would use it in the first place.

Take Eucalyptus for example. They tout themselves as an “open source” cloud solution, but the barriers they erected with their commercial offerings caused the creation of OpenStack – a truly open source solution that in just a few years has easily eclipsed their product. In that same VC meeting the guy asked “yeah, you’re open source, but what is the ‘secret sauce’?”. Well, the “secret sauce” is the fact that OpenNMS is open source. If I were to screw with that we’d stop being a market leader and just become one of many hundreds of commercial offerings, despite any features that make us better than them.

“But,” the open core people will exclaim, “we need to make money.”

One way to make money is to dual-license an open source project. In order to do that, one must own 100% of the copyright. This brings us to the contentious topic of copyright assignment, and Cyanogenmod seems embroiled in this issue at the moment.

I think it was MySQL that pioneered this idea. Their argument was “Sure, you can contribute to the project, but we need you to assign the copyright to the code you wrote to us. Thus, we can offer it under a license like the GPL, but if you want to pay us you can use it under another license.”

In theory this is a great idea, but there are two flaws. The first is that, as a programmer, if I were to create some code and then give away my copyright, then I no longer own what I wrote. Imagine that you wrote some code for MySQL, and, I don’t know, the company gets acquired by, say, Oracle, and you decide you’d like to work on that code for MariaDB. You can’t. You gave it away. You no longer own it.

The second flaw is that when a company makes a commercial offering, the pressure is on to add more stuff to it and leave it out of the “free” version. MySQL started down this path by offering new versions to commercial customers six months or so before releasing them under an open source license; then six months became a year, and then never. This is exactly how Cyanogenmod hopes to pay back that $7 million investment: by requiring device manufacturers to pay for features that they plan to keep out of the open source version.

OpenNMS, I think, has avoided these two traps. First, we do require copyright assignment. One main reason is that we need to be able to defend OpenNMS from people who would try to steal it. This happened a few years ago when a company was using our code in violation of the GPL. When we started legal action to make them stop, their defense was that “if” they were stealing the code, they were stealing from OpenNMS 1.0 (for which, at the time, we didn’t own the copyright) and thus we couldn’t defend it. David Hustace and I mortgaged our houses to acquire that copyright and were able to bring the existing OpenNMS code under one copyright holder.

The next problem to solve was future contributions. Instead of unilaterally declaring that we get sole copyright to all contributions, we actually bothered to ask our community for suggestions. DJ Gregor pointed out the Sun Contributor Agreement (now the Oracle Contributor Agreement), which introduced “dual copyright” to the software industry. In much the same way two authors can share copyright on a book, it is possible for a code author to contribute the copyright in their code to a project while retaining those rights as well. We adopted this for OpenNMS and everyone seems to be pretty happy with it.

Now the second issue, that of a dual license, is harder to address. In the case of OpenNMS it comes down to trust. Trust is very important in the open source world. When I install a pre-compiled binary I am trusting that the person who compiled it didn’t do anything evil. Mark Shuttleworth came under fire for implying that Canonical “had root” in response to some questions about Ubuntu and privacy. While the statement was a little harsh in light of the valid concerns of the community, it was also true. We, as Ubuntu users, trust Canonical not to put in any sort of backdoor into their binaries. The difference between that and commercial software, however, is that the claim can be verified: I have the option of compiling the code myself.

At OpenNMS we promised the community that 100% of the OpenNMS application would always be available under an open source license, and we have kept that promise. In fact, when Juniper (one of our “Powered by OpenNMS” customers) licensed the code, all the additional work they contract from us ended up in OpenNMS as well (you can actually see the code we are working on in feature branches in our git repository). This is a great way to make money and advance the project as it can be used to pay for some of the development.

This is not a plan that Cyanogenmod plans to follow, if the experience of Guillaume Lesniak is any indication.

The only reason I was interested in Cyanogenmod was the fact that it was open source. Now, the beauty of open source is that it almost always offers options. Bradley Kuhn, a person I consider a friend and whose blog post pushed my button to write this in the first place, offers up Replicant as an alternative. I hadn’t looked at that project in a while and it seems to be coming along nicely, with a lot of newly supported devices. Unfortunately my AT&T S3 isn’t one of them (they only support the international version), so I’m looking to switch to AOKP as soon as I can find the time.

It will be interesting to revisit Cyanogenmod in a year. My guess is that anyone not employed by Cyanogenmod, Inc. will flee to other projects, and Cyanogenmod, instead of being the go-to AOSP alternative, will fade into just another commercial offering. It is doubtful that Samsung will license it, since they pride themselves on in-house expertise, and Google is, well, Google. With the exception of HTC, no one else has any market share.

But, what do I know, right?

Ingress Revisited

As I was washing the dishes this morning, I started thinking about Ingress.

Ingress is a massively-multiplayer geolocation game by a division of Google. In it, players must move around to different locations and interact with “portals”.

While it is a fun game, I’ve been having this nagging thought about why Google would spend time on it in the first place. Then it dawned on me: advertising.

A lot of speculative fiction writers, like Paolo Bacigalupi, have predicted a future in which we run around with augmented reality headsets (à la Google Glass) and information is overlaid on top of what we actually see. While it sounds all well and good, creating such a system is non-trivial. You would need something that knows your location pretty accurately and which way you are facing, can present some sort of icon or image with which you can interact, and can do this for lots of people at the same time.

Sound familiar?

I think this is the real reason Google is spending time on Ingress and why they aren’t bothering with an iOS port. What better way to get hundreds of thousands of people to volunteer to beta test your next generation platform?

I think it’s brilliant.

Mint with a Dash of Cinnamon

Since switching to using Linux as my primary desktop, I’m always curious as to what options are available to me besides my default go-to distro of Ubuntu.

While Ubuntu 12.04 (the LTS version) is one of the best desktop operating systems I’ve ever used, I’ve grown less enchanted with each subsequent release since then. Part of it comes from some of the choices made by the Ubuntu team (such as the tight integration with Amazon) and I can work around most of those, but I’ve had numerous stability issues with Unity that didn’t really exist in the older releases.

When Debian wheezy came out, I decided to give it a shot as a desktop operating system. I’ve used Debian as a server O/S for over a decade, but the main thing that makes it great for servers, the cautious nature of changes and inherent stability, kind of sucks for the desktop. I’ve discussed this with Eric, who is both a Debian user and a Debian committer, and his reply is to ask whether you really need umpteen updates to Firefox, etc. I can see his point, but if I’m using, say, Gnome, having access to the latest release can have a huge impact on the user experience.

So I didn’t like wheezy as a desktop, but before going back to Ubuntu I decided to check out Fedora. It does support Gnome 3.8, but I ran into another issue that affects almost all distros outside of Ubuntu: the lack of an easy way to encrypt one’s home directory.

Ubuntu, in the install process, lets you choose to encrypt your home directory. While I’m a firm believer in xkcd’s interpretation of what would happen if someone wanted access to my data, I still like to take some precautions.

I don’t like whole disk encryption for a couple of reasons: partly the possibility of a performance hit, but mainly the fact that I can’t remotely reboot my computer without having someone at the keyboard to type in a passphrase. I figure encrypting /home is a good compromise, especially since the key can be tied to the user’s login via PAM.

I tried to get this to work on wheezy, but the performance was spotty and sometimes I’d log in only to find nothing in my home directory. I didn’t spend too much time on it, since I was eager to use Gnome 3.8, but I was disappointed to find that Fedora didn’t allow one to easily encrypt their home directory either.

Before giving up, I decided to take a shot at Arch Linux. I’ve been hearing wonderful things about this distro at conferences, but the installation process taxed even me. It is seriously bare-bones, but that is supposed to be part of the appeal. The philosophy around Arch is to create a distro with just the things you, the user, want, with access to the latest, greatest and, supposedly, most stable code.

It appealed to me as a great compromise between Debian and getting the latest shiny, but I couldn’t get it installed. You end up having to create your own fstab, and somehow the UUIDs got screwed up and it wouldn’t boot. It also didn’t support encrypting the home directory as an out-of-the-box option, but I was willing to set it up as I did under Debian if I could get the system running. I don’t think it was impossible to get working; I simply ran out of play time and decided to try Arch another day.

On my way back to Ubuntu I decided to try one more distro, Linux Mint. I never made it back to Ubuntu.

Linux Mint 15 is based on Ubuntu 13.04. It removes some of the choices made by the Ubuntu team that raise the hackles of privacy advocates, and it introduces its own desktop manager called Cinnamon.

I quite like it.

I can’t really say what I like about it. It’s pretty, with the exception of the default desktop background (seriously Mint, yeah I know there’s history there but, sheesh), which is easily changed. The Terminal theme is one of the nicest I’ve used. There’s a pop-up menu like Gnome 3, but then there are these little dashlet thingies that let you launch things quickly, and a notifications system that is easy to access without getting in the way.

Running applications and open windows show up in a bar, like Gnome 2 or Windows, but I don’t find myself using that all that much. It is pretty easy to customize the whole thing, such as changing the location of things as well as setting hot corners.

There are a couple of issues. The menu doesn’t seem to index everything like the Dash in Unity does, and I had gotten used to just typing in a few characters of a file name in order to access it. It does seem to remember files you use, so once you have accessed a particular file you can find it via the menu, but not knowing whether a file will show up does impact my workflow. The other issue is that Mint is still bound to Ubuntu, so they share some common bugs.

For example, I use the screenshot app a lot. Under Ubuntu 12.04, when I’d take a screenshot a dialog would appear asking me to save it. A suggested filename, based on timestamp, would be highlighted followed by the .png extension. I could just start typing and it would replace the highlighted text with what I had typed. That got broken in 12.10, so I’d have to reselect the text in order to set the filename. Not a big deal, but a little bit of a pain.

When I switched to Mint, it had the same issue. Note: in the last day or so it seems to have been fixed, since I am not seeing it as of today.

Of course, you get a lot of the Ubuntu-y goodness such as encrypted home directories out of the box with Mint, but Mint may end up being on the winning side of the Wayland vs. Mir argument, since Cinnamon isn’t tied to Mir (or Wayland for that matter).

For those of my three readers with a life, you may not be aware of either of those projects. Basically, for decades the control of graphical displays on most computer screens has been based on a protocol called X11. Under Linux that implementation is currently managed by the X.Org project, a fork of the XFree86 project that was the Linux standard for many years. The next generation display server arising out of X.Org (well, at least from many of its developers) is called Wayland, and in the next few years one can expect it to become the default display server for most Linux distros.

Ubuntu, however, has decided to go in a different direction by launching its own project called Mir. I believe this is mainly because their goal of having a unified user interface across desktop, tablet and phone devices may not be easy to meet under Wayland. My very elementary understanding of Mir is that it allows the whole display space to be managed like one big window – easy to resize under the different screen resolutions of various devices – which differs from Wayland, but I could be making that whole part up.

I’m a huge fan of Ubuntu and I believe that those who do the work get to make the decisions, but I also believe that Wayland will have a much larger adoption base, ergo more users and developers, and will thus be more stable and feature-rich. My own experiences with Unity’s stability on later releases suggest that the first Mir releases will have some issues, and I’ve decided that I’d rather stick with something else.

For the time being that seems to be Mint with Cinnamon. Not only can I get work done using it, the underlying Ubuntu infrastructure means that I can get drivers for my laptop and still play Steam games. I still run Ubuntu 12.04 on my home desktop and laptop, but that is mainly due to lack of time to change over to Mint.

So, if you are looking for a solid Linux desktop experience, check out Mint. I am still amazed at what the free software community gifts me with every day, so my desktop of choice may change in the future, and I’ll be sure to let you know if I find anything better.

Silicon Valley

Ron and I had some meetings scheduled in Silicon Valley last week. It was an interesting trip, so I thought I’d put down a few thoughts.

The trip out was a little painful. Due to storms in Dallas they closed DFW and so our plane got re-routed to Waco. Now the Waco Regional Airport is not the largest in the world (it has two gates) and so they weren’t really set up for handling the few jets that got diverted there, and I’m sure the plan was just to refuel and head back to Dallas when the weather cleared.

Unfortunately, the MD-80 we were on experienced some sort of mechanical issue and it wasn’t getting back to DFW that night. They didn’t announce that publicly (if a delay is caused by weather, the airline isn’t held responsible, but if it is related to maintenance then American would have been responsible for hotels, etc.) and all we were told was that we’d have to take a bus back. I heard about the maintenance issue from the crew, but they wouldn’t give specifics.

We ended up exiting from the rear of the aircraft, something I had never done in years of flying.

It was a little frustrating, specifically because Ron checked a bag. On the plane they told us that he could get his bag if he requested it from the desk, but once we got there we found it wasn’t staffed. By this time we had left the secure area and couldn’t get back to talk with the original person, and it later turned out that the four American Eagle staff decided to hide in the office instead of dealing with questions from our crowd. We were finally told that we couldn’t get his bag and that it would be delivered to San Francisco with our next flight.

I have watched Planes, Trains and Automobiles enough times that as soon as we landed in Waco, I called and booked a room at the DFW Marriott. We managed to get there about 1am, and considering that we were rebooked on a 7am flight we didn’t get much sleep, but at least it wasn’t on the floor of the airport.

Upon arriving at SFO we went to the Admiral’s Club to check on the status of Ron’s bag. They said it had been scanned at DFW and should be on the next plane, which was due to arrive in about three hours time. We decided it was worth it to wait.

It wasn’t.

The bag wasn’t on that flight, nor the one 40 minutes after it, nor the one 10 minutes after that. American seemed incapable of locating the bag or telling us when it might arrive, and I couldn’t help but think that we could build them a better system using OpenNMS. Heck, the bar wouldn’t be all that high, as pretty much anything would have been better than what they have. That afternoon we gave up and decided to head out and just stop by Target to buy some clothes.

The rest of the trip was much better. We met a friend of Ron’s named Mark for dinner and had a really great conversation about pretty much everything, but with a focus on tech and the business of tech. We then called it a night due to having little sleep the night before.

The next morning while Ron was on the phone with American, who were still having issues locating his luggage, the hotel brought the bag to his room. Resupplied with clothes, we were ready to tackle our now completely booked two days of meetings.

It had been awhile since I was on Sand Hill Road, and it seems that things have changed for the better. Most investors seem eager to at least learn about a company like ours that has both customers and profit, and most of the meetings we took were fun.

One wasn’t. It was the same old tired “If you aren’t in Silicon Valley, you can’t be successful” spiel I used to hear every time I came here. The premise is that if you want tech talent, i.e. a talented Director of Sales, you can only find them in the Valley. This contrasted with another person I talked to on this trip, who said he was having trouble finding people because no one wanted to go to a Series A startup. With Facebook, Google, Twitter and others hiring, the top guns are either going there for the security and high salaries or are off starting their own companies.

I couldn’t help myself (it happens) and I had to point out that, since OpenNMS is focused on open source, there is more talent in RTP than in California. Red Hat’s revenue is over a billion dollars annually, and I would like to see the Valley’s equivalent. With all that talent ’n such there should be several companies, right?

Didn’t think so.

On the flight back I was seated next to a woman who was a bit of a hired gun in business consulting and she pointed out that quite a few Valley startups take off like wildfire but then quickly plateau. Her theory is that the area is very insular so business plans tend to target companies in that area and they don’t do well outside of it. I think there is a grain of truth in what she said, although there are notable exceptions such as the companies I named above.

The one thing that is hard to recreate is the sheer density of interesting people. Perhaps it was because I’m now traveling with Ron who knows everybody, but I had some great conversations, one after another. I have had conversations of a similar level in Raleigh, but not in a row like that.

But I am willing to experience that via airplane rather than by living there. Spending over a million dollars for a small house and then having to deal with the traffic, parking and other issues is enough to make me appreciate my current standard of living. Plus, I would have to have a really nice job to afford the Tesla sedan that seems to be the car of choice in the area. At one point we were passed by two red ones on the 101 (one with a dealer tag). I saw only one coupe, but the sedans were everywhere.

We’re off for meetings in other parts of the country (and world) over the next few weeks, so it will be interesting to compare that to my trip West. I’ll try to post my thoughts so that my three readers can experience the wonder that is business travel from some place that isn’t Waco.

Open Source Activist

I’ve known Lyle Estill for over a decade now, and I consider him a friend. So when I got a copy of his latest book Small Stories, Big Changes it went right to the top of my reading list (which is literally two feet high at the moment).

Leafing through the book I realized something: I was mentioned nowhere in it. As someone who has been mentioned in almost all of Lyle’s other books, I braced myself for disappointment. I mean, how can it be good without a little dose of Tarus? Fortunately, that didn’t happen, and I found it a good read.

I think the reason for my omission is that Lyle didn’t do most of the writing for this book. He compiled stories by others, including a number of local people, into a book about “the frontlines of sustainability”.

What is sustainability? One author defines it (via the United Nations) as “the ability of the current generation to meet its needs without compromising the ability of future generations to meet theirs”. This book includes people in the energy business (solar and wind), those who aim to live simply, and others who focus on reducing waste.

Each of these authors has a distinct voice, but they are tied together by Lyle’s introductions, and his relationship to each of them, some close, some distant, brings the book together.

The hardest chapter for me to get through was the one by Gary Phillips.

Gary Phillips is local to Chatham County, North Carolina, and he is probably most famous for being on the receiving end of one of the dirtiest campaigns ever run for County Commissioner, in our county or any other.

For a long time North Carolina politics was a bit of a dichotomy. Nationally, the people here tended to vote Republican, but locally it was all Democrat. This has changed, but back in 2002 the winner of the Democratic primary was, in effect, the winner of the general election.

At the time there was a huge push for development in the county, but for a number of valid reasons (lack of water, water treatment facilities, and other infrastructure being key) the tradition was for a more “slow growth” attitude. A businessman named Bunkey Morgan changed his voter registration to Democrat and “rented” a house in the proper district just to run against Phillips. He won by 320 votes and implemented policies that resulted in Chatham becoming “Zombieland” after the housing bust.

But while Phillips attributes his loss, beyond Bunkey Morgan’s carpetbagger strategy, to “not being white enough,” the biggest thing I remember from that election was the changes he made to his marital status mid-campaign. I think that, as much as Bunkey’s tactics, cost him the commissioner’s seat.

While that chapter didn’t resonate with me, others did. I do like the fact that some of the authors touched on my pet issue with respect to sustainability, namely population control. Many agonized over the “energy equation”; in other words, if I buy a hybrid, do the environmental savings in fuel offset the damage caused by mining the rare earth elements for the car’s batteries? The one thing that was true of every author, including Gary, was how seriously and deeply they felt about this planet we share.

I found “Small Stories” to be a solid read, and Lyle is continuing to collect stories from “activists” of all stripes on his website.

He asked me to contribute my own “open source activist” story, and it is now up on the website. Check it out and let me know what you think.

I Lost My Job!

Okay, please forgive the sensationalist title, but it is true: I am no longer the CEO of the OpenNMS Group. That honor belongs to a man named Ron Louks. We have a press release and everything.

When I took over OpenNMS in May of 2002, I had no idea it would become as big as it has, even to the point of outliving the company that started it. I knew my goal for OpenNMS – to make it the de facto network management application platform for everyone – was huge, and the only way to go about it was to heed our mission statement, which is:

Help Customers – Have Fun – Make Money

That, some luck, and a lot of sweat equity have seen the OpenNMS Group through nearly nine consecutive profitable years, and we have built a great community as well as having the best customers on the planet (in 26 countries, no less). But I knew the day would come when we would need someone with more experience to take the reins and get OpenNMS to that goal, and that someone is Ron.

I’ve known Ron for longer than OpenNMS has been around. He, David and I used to work together, and while the two of us went off to focus on network management, Ron went out in search of huge challenges. He was always focused on the mobile communications industry, and he worked his way up to become the Chief Technical Officer of Sony Ericsson, and then the Chief Strategy Officer at HTC. He managed engineering teams of over one thousand people, and has been responsible for the production of over 200 million mobile devices.

And while he is too modest to point this out, the most successful times in the history of those companies were when Ron worked there.

Ron will be directly responsible for the next phase of OpenNMS. We are planning a pretty aggressive expansion to better serve our customers, as well as improved Windows support and the introduction of some software-as-a-service products to help our users get the most out of OpenNMS. He is fully on board with my two requirements for the OpenNMS platform: it will never suck, and it will always be free.

With Ron’s help we have developed a wonderful business plan that will see some phenomenal growth in the OpenNMS software. Yes, this means that the ever-talked-about but rarely seen OpenNMS “Nukem Forever” Version 2.0 will become a reality (it is, in fact, a key part of our future). With a focus on a new, state-of-the-art user interface, and by taking the already impressive scalability of OpenNMS and making it virtually unlimited, OpenNMS 2.0 will position the platform for the coming “Internet of Things”.

But this didn’t happen overnight – Ron has been on our Board since January and we have spent hundreds of hours making sure this is the right thing for us to do. At one point in the process David deferred a decision to me, saying that no matter how long he has worked on OpenNMS, he still considers it “my baby”.

Well, my baby is all grown up and ready for college. OpenNMS, and the OpenNMS Group, have always been much more about the team than about me. All I did was shelter and nurture it, and now it is time for me to just take pride in watching the project reach its full potential.

Seriously, any credibility I have in this business comes from standing on the shoulders of giants. The only true talent I have is attracting amazing people to work with me, and I plan to put that talent to use as we grow over the next year. While I will no longer be taking the lead on the direction of the company, I have been named Chairman of the Board, and I have chosen to focus on what I love to do best: helping our customers. While we aren’t much on titles, I think mine will read Chief Operations Officer.

But in my heart I will still think of myself as Julie, the Cruise Director, here to make your OpenNMS journey as pleasant as possible.

OpenNMS Gets An Emmy Nomination

Okay, so I’m stretching things a bit. Well, a whole lot. In fact, OpenNMS had nothing to do with the Emmy nod, and it is just a shameless attempt to get your attention.

I believe I have very little natural talent. The one exception is that I seem to be able to surround myself with some of the most amazing people on the planet. They do great things and I just bask in the reflected glory.

I’m not knocking it.

One of those people is our chief architect and CTO, Matt Brozowski. In his copious spare time he manages to do a lot of things, including coaching a program at the University of North Carolina called “Powering a Nation”. Each year students create a documentary involving some aspect of energy use in the United States, and the 2012 team created “100 Gallons: How Water Powers Life”.

It got nominated for an Emmy award.

How cool is that? It would be awesome if they won.

Matt is also coaching the 2013 team, so let’s see if they can go two for two.

Dev-Jam 2013: Day Three

We are now more than halfway through Dev-Jam and the energy level remains high.

Well, at least it looks like everyone is working. As a non-coder I’m not sure exactly what is going on, but folks seem to be having a lot of fun. (grin)

I’ve been coming here for many years now, but I never managed to visit the Weisman Art Museum on campus. It is a very distinctive building.

Richard and I decided to go, and it is pretty cool. The building is designed to let in a lot of natural light without any of it directly hitting the pieces. They have lots of paintings, plus some photographs, sculptures and pottery. I especially liked this piece:

but I couldn’t help thinking “four elements surrounding a fifth … a Fifth Element”.

Wednesday is traditionally cookout day, but as we grow it is becoming a lot more work to man the grill.

Between the hamburgers, local sausages and leftover Brasa we had tons of food.

Last year Mike and DJ made Jeni’s ice cream from scratch. Now, that was a lot of work, so this year we just outsourced it to Izzy’s, which, while not Jeni’s, was still pretty awesome.

And what better way to end the evening than with a little Army of Darkness.