Using apps to do a “Pothole and Poop Patrol”

Setting the stage: Last June I was blown away by an insurance company’s commercial for an iPhone/smartphone app that lets people properly document a car accident in order to help simplify the claims process.

Looking through the newspaper this morning, I noticed yet another example of an otherwise mundane app for smartphones: apparently, a number of American (and presumably other) cities have apps that let local citizens collect data on potholes — including photos and location, usually, though not always, by GPS coordinates — and report them over 3G or Wi-Fi directly to the local public works department. This saves money by bypassing presumably more expensive operators, field inspectors and the like, and by directing workers straight to where the work is needed instead of waiting for the information to trickle through the system. Essentially, it puts crowd-sourcing — the notion that “many eyes will eventually reveal bugs” — to work.
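
As a rough illustration of the idea — the endpoint, field names, and payload below are entirely made up, just a sketch of what such an app presumably sends over the wire:

```python
import json
import urllib.request

# A hypothetical pothole report -- roughly the data such an app collects:
# a photo, GPS coordinates, and a short description.
report = {
    "category": "pothole",
    "latitude": 45.5017,      # example coordinates (downtown Montreal)
    "longitude": -73.5673,
    "description": "large pothole in the right lane",
    "photo": "pothole.jpg",
}

# Submit the report over whatever connection is handy (3G or a Wi-Fi
# hotspot) to the city's public works endpoint -- a placeholder URL here.
req = urllib.request.Request(
    "https://city.example.org/api/reports",
    data=json.dumps(report).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print("report submitted, HTTP status:", resp.status)
```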

Many such apps are also more general, allowing people to report all sorts of things beyond potholes: broken lamp standards, water main breaks, and the like.

Beyond being impressed, it made me think back to 2001-2002, when I’d just gotten a GPS and started geocaching. One of the funny stories going around geocaching circles (and no doubt general GPS circles), as people were learning about the uses of GPS with 3m-8m accuracy, involved groups of people essentially going out on “Poop Patrol”, marking the locations of piles of poop left by — no, I won’t indulge in the joke that just came into my mind and perhaps yours — ok, here it is, poople, who don’t clean up after their dogs. We thought: “What, don’t people have better things to do with their lives than go around looking for piles of poop and filling up their GPS memory with the locations? What are they going to do with the information and all the waypoints? Chase down and tackle the offenders? And what about the local council meetings where people will no doubt be laughed at during question period when they bring out their lists of waypoints?”

Funny — mulling over the “Pothole Patrol” I read about in the paper this morning, the “Poop Patrol” seemed less amusing in the ridiculous sense and more viable as a way of measuring hot spots for increased street cleaning, or of identifying dog-walking hot spots where municipalities might consider adding dog runs they might not have considered without the “Poop Patrol” data. They could also add or reinforce secondary services in those areas, such as mounting bag dispensers on lampposts for dog walkers who forgot their “poop & scoop” bags (filled, of course, by conscientious dog-walkers donating their excess plastic bags), as can be found in many dog-run parks, or adding extra garbage bins.

Again, along those lines, I’m thinking about geocaching.com, which facilitates “Benchmark Hunting” — in its most basic form, taking the coordinates of USGS benchmarks, going out to hunt them for the pleasure of it, and then logging the finds, as well as the adventures along the way, on the website. I bet the USGS takes advantage of these informal “inspections” in some way.

Or how, around 2001-2002, again when I was starting off with geocaching, I’d registered for an account with Natural Resources Canada to search out gravimetric markers — essentially the same as geographic benchmarks, but whose purpose and location relate to standard measurements for gravity. I live near one, and there’s one near my cottage, so the idea of doing volunteer inspections based on simple checklists seemed like a fun complementary activity to geocaching. Such a checklist could contain, say, five items on the physical integrity of the marker, access to it, and so on, which any person off the street could run through on a regular, semi-regular, or sporadic basis. The results could be useful to the maintainers, letting the responsible body channel resources to “more important” activities and prioritize maintenance schedules, as above.
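
To give a feel for what I mean — the five check items and the record format below are my own invention, just a sketch of such a checklist:

```python
from datetime import date

# A made-up five-item checklist for a gravimetric marker; any person
# off the street could run through it on a sporadic basis.
CHECKLIST = [
    "Marker is present and legible",
    "Marker is firmly seated (no wobble or frost heaving)",
    "No visible damage, corrosion, or vandalism",
    "Access path is clear (no overgrowth or obstruction)",
    "Immediate surroundings undisturbed (no construction or digging)",
]

def record_inspection(marker_id, answers, notes=""):
    """Bundle one volunteer inspection into a simple record that the
    maintaining agency could use to prioritize maintenance."""
    assert len(answers) == len(CHECKLIST)
    return {
        "marker": marker_id,
        "date": date.today().isoformat(),
        "results": dict(zip(CHECKLIST, answers)),
        "notes": notes,
    }

# Example: everything fine except some overgrowth on the access path.
print(record_inspection("NRCan-1234", [True, True, True, False, True],
                        "path partially overgrown"))
```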

At the same time, I also “noticed” all the Bell Canada telephone switching boxes along the way to the cottage in the same light and thought about doing volunteer inspections, which I never pursued.

Unfortunately home networking issues at the time made accessing my account with Natural Resources Canada difficult and the charm of both ideas fizzled out.

But now the ideas, from the “Poop Patrol” to volunteer inspections of gravimetric markers and Bell Canada switching boxes, seem less silly in the light of the “Pothole Patrol” — taking into account human idiosyncrasies and the human penchant for such trivial pastimes …

Handheld computer apps

Here’s a blog entry I meant to post last June, 2009, but for whatever reason I never got around to it. I’m doing it now to set the scene for my next entry. 🙂

*****

I just saw a commercial for Nationwide Insurance. And I was blown away.

Several times, it mentions their “accident app” for the unnamed but clearly identifiable iPhone / iPod Touch. It walks you through what you need to do if you have a car accident — take photos of the damage, the area, the other car, the street, where the cars are relative to everything, then record the address, the license plate of the insured car, your details, the other car’s and driver’s details … all this gleaned just from the commercial, of course.

And I’m thinking … that’s the kind of thing a handheld with an integrated camera is for — not just taking pictures of Fluffy, Rover or the kids every five minutes and sending them to friends, or playing some inane game. Without ever having checked it out myself, I’d been thinking the Apple App Store was probably full of useless apps like tip calculators and calorie counters.

I used to have a Palm Pilot. I still have it, in fact, but I don’t use it. You’d look all around the net for little apps, and there’d be plenty of useless ones, and you’d get tired when the one or two useful apps just didn’t cut it anymore — especially after all the effort it took to find what was (usually) a second-rate app.

Or maybe I’m glimpsing why Ubuntu is doing so well.

I think I’ve been so high on my horse about open source that I’ve missed something.

Or maybe one of the drawbacks of open source is that there isn’t an open-source Linux “App Store” out there creating the buzz that “you need this, it’s a killer app” or “it’s a killer appliance” — creating the desire for the product (with the apps then following), or whatever. (March 2010: Hello, Android!)

(Addition in March 2010) Now of course this is a really bad observation on a technical level: the Apple “App Store” is a kind of repository for the iPhone and iPod Touch, and there are plenty of Linux repositories; and, despite the completely different paradigms between a desktop (or even a laptop or netbook) and a handheld device, there are of course plenty of little programs available that would count as apps for your computer. Despite its convenience, I wouldn’t want to use my Acer notebook to fill out all the details of my latest car crash, even though it has a webcam and Wi-Fi, and I suppose I could get a 3G dongle for it. My meaning was more along the lines of my response to a survey about a tax program, which asked why I used the version I used — in my case, the web version. I said I used it because I use Linux and they don’t have a Linux version, and that in order to have a Linux version, the best way would be to push it through the distributions’ repositories or host their own repository, so as to maintain bug fixes and updates in tax laws. The point being that the aforementioned buzz no longer surrounds having a computer (or more) in every house; what makes handheld devices so buzz-worthy is the combination of small convenient size and processing power. See next.

(Back to June 2009) My brother, who has an iPod Touch, says the big difference between handheld computers today and those of five and ten years ago — aside from memory and processing power and the like — is the presence of Wi-Fi capability and hotspots; the inclusion of a camera was implicit in his comments. And nowadays, GPS antennas, motion detectors, and the like. Things “that could be done” five and ten years ago just weren’t done because, well, the instant connectivity wasn’t there — and the integration of connectivity into the applications and related software suddenly makes it all seem like the obvious thing, rather than loading the handheld into a dock and syncing it with your desktop.

It’s tax time, and the Government of Canada supports Linux!

Doing a bit of research at tax time, I went to Service Canada’s website to get some extra information I needed. I finally figured out how to navigate through some pages, and whaddya know, they support two Linux distributions: Fedora (they added, incorrectly, “Core”) 8 — which of course is now out of date — and Ubuntu 7.1, which I suppose was really 7.10. I suppose to some government person who doesn’t quite understand Ubuntu’s version numbering, 7.1 and 7.10 are “about the same” — of course, were there any validity to it at all, 7.1 would represent the January 2007 release of Ubuntu, which never existed, as opposed to the October 2007 release. 🙂

I was pleased to see them finally picking up the slack, even if this was put in place about 2 years ago. 🙂

And of course, here’s the screenshot, with the appropriate areas highlighted.

Service Canada Supports Linux!

Cool (or mundane) computer trick impresses co-worker

I managed to impress someone at the office this week with a cool (read mundane) computer trick.

I got a call from the secretary, who sits a few seconds’ walk from my desk, asking for a scanned version of my hand-written signature. I replied that I have one on my computer at home and could easily get it within a few minutes; she replied that it would be faster for her to just walk over with a piece of paper for me to sign, which she would then scan and play around with.

And this is where I began to impress her: by the time she got to my desk with said sheet of paper, I had already VNC’d into my home server’s desktop and was in the process of doing the same from the server to my main computer’s desktop (gotta finish giving it a static IP and setting it up so that I don’t have to go through my home server 🙂 ). I finished logging into my desktop, looked in the likely directory, and voilà! I fired up my home email client, and within a couple of minutes she’d received my scanned signature.

Beyond the fact that the Gnome desktop is set up out of the box to do VNC — and the fact that I installed TigerVNC instead of using the standard Gnome Remote Desktop Viewer — it’s too bad I can’t really claim this as a cool Linux trick, since my computer at work runs Windows, and you can set up Windows boxes to “pick up the phone” too ….

She was still impressed, though. And it took about as much time as the whole process of signing a piece of paper, scanning it, cropping it, etc.

Canola oil instead of petroleum oil car treatments and ethanol blends

I was impressed the other day when I finally got around to rustproofing my car at Antirouille Métropolitain, a chain of rustproofing businesses in Quebec. My car is 13-14 years old and has virtually no rust, although I have to repaint the running board on the driver’s side yet again; I let things go too long over the past few months, so the rust is starting up, but it’s not bad at all. Yet.

They asked me, “Do you want the traditional oil-based treatment or the ‘bio’ treatment? It’s dripless and made of canola oil.”

Apparently the selling point with most people was that it’s dripless, vs. their traditional oil treatment, for which the optimum formula is necessarily drippy. For me the selling point was that it’s canola oil, and the dripless part was just a secondary bonus. This doesn’t affect their usual performance guarantees.

After I’d paid, and while the technician was prepping my car and even starting the treatment, I asked the man behind the counter, “Aren’t you going to tell your technician to use the canola oil treatment?” To my surprise, he replied that their default policy is to treat cars with the canola oil unless the customer expressly asks for the traditional oil treatment, in which case he informs the technician to use “the old treatment”.

The story is that it took three years to develop the product so that its effects would be equivalent to their traditional oil treatment, and they then spent two more years doing road tests before widespread commercialization, which began in early 2009. Apparently the canola oil treatment is the overwhelming choice at this location, and business-wide to varying degrees — no doubt due to some clever marketing and a highly refined counter-level sales pitch that had me sold hook, line and sinker — to the point that they sell perhaps one or two traditional oil treatments per week, if that. The principal selling point, as mentioned earlier, is that it’s dripless: in urban centres such as Montreal and Quebec City this matters, because people don’t like oil drip marks in their driveways and on their garage floors. In somewhat less urban centres such as Sherbrooke, the adoption rate of the canola oil treatment is down to 40% to 60%, apparently because the market, with its larger rural clientele, is less likely to have asphalt driveways or concrete garage floors that would be stained by the dripping oil, and/or is slower to shed old habits — such as the “old” mentality (and old sales pitch) that drippiness is a necessary side-effect of the formulation having its maximum effect.

So I was quite impressed that the market is slowly shifting away from some “old fashioned” treatments. Now let’s hope that the rest of the formulation doesn’t outweigh the benefits of replacing the petroleum components.

Note that for the past few months I’ve also been making a point of buying gas from Sonic, since they seem to be the only mainstream chain of gas stations in Quebec, or at least in the Montreal area, that sells ethanol blends (6%-10%); they also sell biodiesel blends. Sometimes I go really out of my way or plan routes to pass near a Sonic, but usually not much, since there happens to be a Sonic minutes from home. The other Sonic I occasionally frequent is near Drummondville, when I happen to be driving that way. There is another along the way west towards the end of the island. Apparently a few other gas stations — I presume independents — also sell ethanol blends in my area, although I have yet to locate them.

This part about the gas has been quite the reverse culture shock from Ottawa, where it’s unusual (or was, about 12 years ago when I worked there) for a gas station either not to sell ethanol blends or at least not to be within a couple of blocks of one that does; it’s taken me over 12 years to finally get back to making a point of using ethanol blends.

Now if only the ethanol blends were more widely available, and the blends higher; however, a quick check on Wikipedia suggests that most cars with standard gasoline engines can only tolerate up to about 10% ethanol without some kind of adjustment.

PDFs, Scanning, and File Sizes

I’ve been playing around with PDFs for the past few weeks and have noticed a very interesting thing: a PDF is *not* a PDF is *not* a PDF is *not* a PDF, ad nauseam and, it would seem, ad infinitum. At least, so it would seem. Part of me almost wonders if the only distinguishing feature of a PDF is the .pdf extension at the end of the file. In “researching” this post I have learned what I knew already: PDF boils down to being simply a container format.

Lately I have been scanning some annual reports from years past for an organization I belong to, and due to the way xsane 0.997 (the version that comes with Fedora 12) scans pages — which, I will concede straight out of the gate, I have only explored enough to get it to do what I want and to learn how it does things “its way” — the PDF file sizes are “fairly” large.

Along the way, I ran into one of the quirks of xsane 0.997: something about the settings doesn’t have it stop between pages so I can change pages — or at least, I haven’t yet found the setting that makes it pause between pages. This matters because my scanner doesn’t have an automatic page feeder. The first page of results of a Google search turns up several comments about this problem, but no solution; at first glance the second page of results is no help either.

So I end up scanning pages one at a time, then using Ghostscript to join them all up at the end into a single PDF.
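
For the record, the joining step boils down to a single Ghostscript pdfwrite run; here’s roughly what I do, wrapped in a bit of Python to collect the page files (the filenames are just examples):

```python
import glob
import subprocess

# Collect the single-page PDFs produced by xsane, in page order.
pages = sorted(glob.glob("report-page-*.pdf"))

# Ghostscript's pdfwrite device re-distills all the inputs into one
# output PDF; -dBATCH and -dNOPAUSE keep it from prompting.
subprocess.run(
    ["gs", "-dBATCH", "-dNOPAUSE", "-q",
     "-sDEVICE=pdfwrite",
     "-sOutputFile=report-complete.pdf",
     *pages],
    check=True,
)
```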

Without having added up the file sizes, it was obvious that the total size of all the scanned pages — at 75 dpi and in black and white — was noticeably larger than the single PDF with all the pages joined. This didn’t bother me since, again without having added things up, the difference didn’t seem *too* great, and I assumed the savings were principally due to administrative redundancies being eliminated by having one “container” as opposed to 25 or 30 “containers”, one per page.

Then this week a curious thing occurred: I scanned a six-page magazine article and, separately, a two-page magazine article, at 100 dpi and in colour, and whaddya know — the combined PDF of each set is smaller than any of the original source files. Significantly so. In fact, the largest page from the set of six is double the size of the final integrated PDF, and in the second set, each of the two original pages is triple the size of the combined PDF. I’m blown away.

Discussing this with someone who knows the insides of computers way more than I do, I learned something: it would appear that xsane probably creates PDFs using the TIFF format (for image quality), as opposed to what I imagine Ghostscript does when joining files, which seems to be to do whatever it can to reduce file sizes — in this case, I imagine, converting the TIFFs inside the PDFs into JPEGs. A bit of googling indeed appears to associate TIFFs and PDFs when it comes to xsane; indeed, a check of the “multipage” settings shows three output file formats — PDF, PostScript and TIFF. And looking in Preferences/Setup/Filetype, the TIFF Zip Compression Rate is set at 6 out of 9.
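
Consistent with that theory, you can tell Ghostscript’s pdfwrite device explicitly how aggressively to recompress and downsample images when re-distilling; a minimal sketch (the preset choice and filenames here are mine, not anything xsane or my “person in the know” prescribed):

```python
import subprocess

# Re-distill a scanned PDF with the /ebook preset, which downsamples
# images to roughly 150 dpi and recompresses most of them with JPEG
# (DCT) encoding -- often a big win on image-heavy scans.
subprocess.run(
    ["gs", "-dBATCH", "-dNOPAUSE", "-q",
     "-sDEVICE=pdfwrite",
     "-dPDFSETTINGS=/ebook",
     "-sOutputFile=article-smaller.pdf",
     "article-scanned.pdf"],
    check=True,
)
```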

So I googled PDF sizing, and one result led me to an explanation of the difference between the “Save” and “Save As …” options when editing a PDF: “Save” will typically append metadata on top of metadata (including *not* replacing the expired metadata in the “same” fields!); “Save As” is what you really want to do to avoid a bloated file, since everything that should be replaced actually gets replaced.

Another result describes (no doubt just a taste of) the various possible settings in a PDF file, and how, using a given PDF editing application, you can go through a PDF, remove some settings, correct others, and so on, reducing its size by essentially eliminating redundant or situationally irrelevant information — such as fields with null values — whose presence would bloat the file unnecessarily.

I’ve known for a few years that PDFs are a funny beast by nature when it comes to size. For me the best example by far used to be the use of “non-standard fonts” in the source file — oh, say, any open-source font that isn’t on the standard list of “don’t bother embedding this font, since we all know that nine out of ten computers on the planet have it”. In and of itself this isn’t a problem: why not allow for file size savings when it’s reasonable to presume that many text PDFs are based on a known set of fonts, and most people already have that set installed on their system? However, when you use a non-standard font, or you’re the tenth computer, and you constantly create four- to six-page PDF text documents ten times the size of the source documents, frustration sets in. Having wondered whether one could designate a font substitution along the lines of “use a Roman font such as Times New Roman” when such a font is used — in my case, Liberation Serif or occasionally Nimbus Roman No9 L — I asked my “person in the know”. Apparently, Fedora 12’s default Ghostscript install, whose settings I have not modified, does just that.
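
Incidentally, if you want to see which fonts a given PDF actually embeds, poppler’s pdffonts utility will tell you (assuming poppler-utils is installed; the “emb” column is the interesting one):

```python
import subprocess

# pdffonts (from poppler-utils) lists every font a PDF references;
# the "emb" column shows whether the font is embedded in the file
# or left for the reader's system to supply.
out = subprocess.run(["pdffonts", "mydocument.pdf"],
                     capture_output=True, text=True, check=True)
print(out.stdout)
```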

I guess what really gets me about this is how complicated the PDF standard must be, and how wildly variable the implementations are — at least, given that Adobe licences PDF creation for free provided the implementations respect the complete standard — or, more to the point, how wildly variable the assumptions and settings are in all sorts of software when creating a PDF. I bet that were I to take the same source and change one thing, such as the equipment or the software, the results would be wildly different.

Concurrent to the above scanning project, I happened to experiment with a portable scanner — a fun challenge in and of itself to make it work, but it did, without “too much fuss”. And I found out something interesting, which I knew had nothing to do with PDFs but (I presume) rather with scanners, drivers, and xsane. I tried scanning some pages of one of the said annual reports with the portable scanner, on an identical Fedora 12 setup using xsane, and the PDFs produced were far larger than those scanned with my desktop flatbed scanner. The flatbed would scan the text and the area immediately surrounding it, but correctly identified the “blank” part of the page as blank and did not scan those areas, significantly reducing the scanned image size. The portable scanner did no such thing: it created images of the whole page — blank spaces rendered, in this case, to a dull grey and all — thereby creating significantly larger PDF files from the same pages. As I said, though, I assume this is a function of the individual scanners and their drivers, and possibly of how xsane interacts with them, and not, in my mind, a function per se of how xsane creates PDF files.

Another interesting lesson.

AT&T does it again! (AKA Will I Ever Learn?)

So I just turned on my TV and here’s a commercial … family dinner … It’s Mom’s tablecloth … Back in the day my grandmother made this for me, they don’t make them like they used to anymore … pass the spaghetti … OOOPS! — NO, WAIT! Don’t do anything!

And they all naturally go to the net to look for a solution (peroxide and something else, everyone in internet cafés and schools around the world yells at their computer screens). And what does the computer screen look like?

It bears a vague resemblance to the Gnome desktop under Ubuntu, with the white toolbars on top and bottom and hints of brown here and there, but it’s just a touch too blurry to identify as anything other than NOT Windows — probably MovieOS.

I guess that every time they shoot a commercial, the geeky “I use Linux at home, I’d love to have the bragging rights to *that* computer in the TV commercial” IT guy in the back is on their day off, or they don’t want to give Gnome or KDE a financial nod. Yet they go to the trouble of avoiding an MS or Apple desktop. Interesting.

(sigh …)

19 months, 16 *successful* installs

I just did a tally of all the installs I’ve done on my personal systems since the end of June 2008, when I bought a new-to-me desktop and took advantage of the opportunity to upgrade from the CentOS 4.x series to the CentOS 5.x series. 🙂 And I was a bit blown over — unfortunately not surprised, but blown over nonetheless.

Over 5 systems, I’ve done 16 successful installs. Then there were a few dud installs that had to be restarted right away — although a couple of those duds were counted, because the installs were actually useful for a few weeks. One install in the most recent cycle was not counted as successful despite being a successful install: unfortunately, the boot sector on the drive died. (It was to be expected — back last June or July, Palimpsest identified the drive as having a fatal error and gave it about 6 months to live, and whaddya know!) So I had to get another “new” drive, which I happened to have handy, and do another install.

To be fair, there has been one factory-sealed new system thrown in there (what fun to wipe the Windows install — which, curiously, apparently irreparably froze up after all the updates were done — the whole thing just to be able to say “yeah, but Windows worked on it!”, which it didn’t!), another system that has just about never been used since and after a few months has now been removed from the upgrade cycle, another system that finally died, or at least whose ghost I have given up on, and finally a replacement system for said “death in the family”.

One of the reasons I always say “I’d love to go back to CentOS if it weren’t so hopelessly obsolete” is that it’s stable and has a long supported life (typically something like 7 years), so you don’t have to upgrade every six months like with Fedora — oops, that’s every 12 months or so, given the support cycle (wink wink) 🙂 — though, to be fair, Fedora *has* been good to me since I started using it at version 9.

The problem is that when you have several systems, you’re still doing installs every 6 months or so if the systems aren’t in sync with each other. Further, one of the consequences of using second- or third-hand computers, buying new computers, upgrading parts and hard drives, and even trying out another distro at least once, is that your systems are hardly ever likely to stay in sync for the whole 13-month-or-so lifespan of a new-version-every-6-months distro like Fedora. And of course, someone who would like to avoid doing new installs every 6 months is going to upgrade an out-of-sync system to bring it in line with the others, in the hopes that “this will be the cycle when I get to enjoy the whole lifespan and not have to upgrade 6 months from now”.

Hence the ideal of trying to break the “install every 6 months habit” by syncing all the installs whenever a single new install is done turns out to be fallacious once you have at least two systems. In fact, you end up doing the opposite: not only are you installing (or re-installing) at least once every six months for one legitimate reason or another, but you end up doing multiple installs every 6 months — many unnecessary in and of themselves — just to keep everything in sync. And thus, the “install every 6 months habit”.

Of course, I have often been enjoying the process despite myself; in fact, I’ve managed to put together an ever-lengthening list of steps to take from start to finish when installing a system (which I’ll be presenting to one of the local LUGs in a few weeks). Fortunately, my computers are purely home desktops or hobby servers without any critical processes on them, and my brother at least humours my habit by doing those little bits that are still beyond my ever-increasing sysadmin skill set (which of course grows with each install cycle). And in the process I’m gaining a practical appreciation for what I’ve known all along, since I started using Linux in 2006 and started with CentOS: “The likes of Fedora and Ubuntu may be great, but you have to re-install every 6 months! Who wants to do that?!?!” (Apparently, I do. 🙂 )

It must be interesting having multiple production servers with multiple versions of a given distro, let alone more than one distro (i.e. a mix of CentOS, Debian, SuSE, and, for some good fun, Slackware). It’s a good thing that having “the latest and greatest” usually isn’t particularly important on a server, so that it can actually have a useful life. It must be hard for the likes of Red Hat, for instance, which must add new drivers all the time but, in order to keep from breaking compatibility and introducing “bad change” into the distro, holds other things back — things like the HPLIP version that is one incremental subversion (or whatever the 0.x increments are called) behind the minimum requirements for my 2.5-year-old printer, and which has since gone through several such incremental upgrades and at least one whole version upgrade.

News Flash — Linux spotted in the wilds of Montreal!

This morning I did something very unusual, for me: I took the commuter train into work instead of driving my car — and I saw a Gnome desktop on someone’s laptop! Doing a double take, I checked, and whaddya know: definitely a Gnome desktop, very familiar-looking, not brown, and yup, it was Fedora 12.

A few weeks ago my brother posted on Slashdot asking whether anyone had seen Linux in use in the wild — not in data centres, of course, nor at LUG meetings or other such gatherings of Linux types where Linux is naturally expected, but random, innocent spottings in places like restaurants, cafés, university or college student halls, on the street, on the train, etc. The responses were an underwhelming (or disappointingly overwhelming) “no”. In fact, my brother said that I was the only person he knew who used a Linux desktop besides himself, and that I’m far more pure about it than he is. (He uses Windows as regularly as Linux on his personal systems, while I “only” use Windows at work, and don’t particularly care for it.) Besides seeing Linux desktops at LUGs and Linux Meetups, offhand I can only think of two people I know who say they use Linux at home as their desktop.

I started chatting with this person, who apparently develops software for a particular industry (no, not that industry), to be used on Red Hat 5.x servers. They use Fedora because CentOS is hopelessly out of date for things like wireless support on their computer; unfortunately, though, they have been finding Fedora 12 unstable … not my experience so far.

Suffice it to say that, even putting aside the Fedora part, this chance meeting made my day!

Fedora 12 installed — I’m a linux addict with an install every 6 months habit

Well, over the past couple of weeks I’ve installed Fedora 12 on three systems — mainly because I got a great, great, great new P4 3.0GHz home server, which I have been considering using as my desktop while turning my current desktop, a P4 2.8GHz, into the server.

To my dismay, I did this 6 months after I’d made a point of having the same version of Fedora on all my computers, precisely to get off the “reformat a system every six months” treadmill I’d been on by having different versions. But, well, my old server died, and of course there was no point putting a 6-month-old version of Fedora on the new one, which I would only *have* to change 6 months from now anyway … Sigh, the bliss of using CentOS — were it not so completely obsolete, I would love to use it again … On the other hand, Fedora is the crack cocaine of “latest and greatest”, so for the moment there’s no going back!

All of this started back in, what, September, if not before. I couldn’t get the 80 gig drive and the 500 gig drive to play nice together — or so I thought. There *was* an issue with different spin speeds, but wait folks, there’s more. When I *did* have the 500 gig as the boot disk, something seemed off with the amount of available storage, although I wasn’t fully aware of it at the time. When I finally brought the 500 back as the boot drive, the installation went well several times with Fedora, then with CentOS 5.1 (which would have been promptly updated upon reboot). Except the first reboot wouldn’t work: the system would freeze, and the keyboard buffer would fill up real quick. Forums were of little help, with plenty of dead ends and apparent red herrings. Finally, I figured out on my own that the BIOS was way too old to recognize such a large drive, and flashing it with a “new” BIOS would have required a lot of fun with WINE, which I wasn’t really keen to get into using a live CD.

Christmas and a new server came along, and I’m up and running with a desktop upgraded along this route: June 2008, CentOS 5.1 to 5.2; July 2008, some version of Ubuntu; December 2008, Fedora 10; July 2009, Fedora 11; and now January 2010, Fedora 12 … with a netbook, a laptop which is no longer used, an old server, and a new server each following a similar route for much of the way. So much for even taking advantage of Fedora’s support window of “one month after the release of the second following version” … I’m still upgrading every 6 months!

As a result, though, I have finally refined my “to-do” list for installing a machine so that it’s not so much of a hassle; in fact, two of the three setups were not only a breeze in and of themselves, but the to-do lists made the rest of the work a breeze too. Of course, my brother told me two years ago that his list was 300+ steps long, and that he’d found a two-year-old such list that was only about 120 items long. My list is currently somewhere around 58 items, depending on how you count … I wonder how long it’ll take to get to 300? 🙂

However, I had problems with the desktop right after it installed like a breeze: the disk boot sector died (I’d expected it would, as of about 6 months ago), and funnily enough, the memory stick on which the setup had worked like a breeze before suddenly wasn’t cooperating. Gotta figure out what was going wrong with UNetbootin creating the USB stick images from the ISO — curiously, the boot image required after the disk formatting in Anaconda wasn’t being properly copied, or at least activated, on the stick.

Anyway, I think I have to work on getting the most out of the system; I bet that months from now Fedora will find a way to make me upgrade again, with that lucky number associated with it and all … 🙂