malak.ca — Server Hard Drive Upgrade

This is just a little note to mention that malak.ca has been down for the past 28 hours or so for an upgrade that was only planned a few days ago, when the site had been hanging for anywhere from a few hours to a few days at a time, and diagnostics suggested that the hard drive might be on its last legs.
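
For the curious, the sort of diagnostics in question can be approximated with the SMART tools; a minimal sketch, assuming the smartmontools package and a drive at /dev/sda:

    # quick overall health assessment from the drive's SMART data
    sudo smartctl -H /dev/sda

    # full report; reallocated or pending sectors are the classic bad signs
    sudo smartctl -a /dev/sda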

A new 256-gig SSD was ordered and installed in the IBM ThinkCentre I claimed in 2017 from a pile of computers destined for disposal, but only actually started using in 2020. The intention was to essentially set things up as they were beforehand, with only an under-the-hood change of technology.

Here are a few highlights:

  • A backup of the blog database was created and saved on an external drive (see the sketch after this list);
  • The external drive, used as a backup for my other computers and the location of the static parts of my website, was separated from the machine, which was then powered down;
  • The old hard drive was physically removed;
  • The SSD was connected;
  • Fedora 34 Workstation, which had previously been downloaded and written to a USB key, was installed on the SSD yesterday evening (I'm currently still running F33 on my desktop, laptop, and one of my worldcommunitygrid.org nodes);
    • The desktop for F34, on the core 2 duo, is faster, although some of that is due to the SSD, of course;
    • Interesting to see the dock moved from a vertical position on the left to a horizontal position at the bottom;
    • I find it interesting that at bootup, the activities screen appears to be the default;
  • This evening, the web server was installed;
    • Although we had planned to use php-fpm to separate permissions, since this is a single-domain box, we used a simple virtualhost;
  • MariaDB was installed;
  • My redirections for things like www.malak.ca were re-registered with noip.com, to account for the dynamic nature of my IP address;
  • My Let's Encrypt certificates were re-registered;
  • Various linux kung-fu tricks were performed, and magical linux incantations were uttered, and the setup was complete;
  • The external drive was reconnected;
  • The blog was restored from a backup.
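
For those curious about the database and certificate steps above, here is a minimal sketch of what the backup, re-registration, and restore can look like; the database name and paths are hypothetical, and it assumes MariaDB's client tools and the certbot client:

    # dump the blog database to the external drive before the rebuild
    mysqldump -u root -p blog > /mnt/external/blog-backup.sql

    # on the new install, re-register the Let's Encrypt certificates
    sudo certbot --apache -d malak.ca -d www.malak.ca

    # recreate the database and restore the blog from the backup
    mysql -u root -p -e "CREATE DATABASE blog"
    mysql -u root -p blog < /mnt/external/blog-backup.sql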

The system is peppy, and this blog, which is hosted on the SSD instead of the external drive (as is the rest of malak.ca), loads somewhat more quickly.

As usual, great thanks go to my brother whose herculean efforts were at the core of the setup. Thank you!

Dumpster Diving for Old Computers

To paraphrase Forrest Gump’s mother, “Dumpster diving for computers is like a box of chocolates … you never know what you’re gonna get.”

For at least the past twelve years, I have been salvaging computers found on the street on garbage day, or in other locations where my various personal travels have taken me, to reformat into usable computers. The various finds have served as main desktop computers, secondary computers, home servers, computation nodes for the World Community Grid, gifts to my brother or the occasional friend, and the like. It has variously allowed me to indulge in a bit of tinkering, try out a new linux distro or version of BSD, build a home server, or just pass the time while engaging in a hobby.

In the process, I've watched the lower bar of what is acceptable "junk that isn't junk, at least not yet" move upwards: from about P4-533 MHz 32-bit processors to dual-core 2.66 GHz 64-bit processors (although a single-core 64-bit P4 in the 3.4 GHz to 3.8 GHz range is good if you don't want to depend on a GUI, or if you have a lot of RAM and an SSD), from 512 MB of RAM to 2 GB, and from 20 GB hard drives to 80 GB. Now it seems that the next big thing will be the move from mechanical drives to SSDs, which I expect — when SSDs become common in the old computers I find being thrown out — will make a revolutionary upward change in the speed of low-end hardware, the way I saw first-hand in 2017 when I swapped the mechanical drive in my laptop for an SSD. (To be fair, when I bought that computer new in 2015, the hard drive was curiously a 5400 RPM model, presumably either to make it less expensive, less power hungry vis-à-vis battery life, or both.)

As an aside: My favourite brands of castoffs have been, in order, IBM / Lenovo ThinkCentres, then Dells. After that, I've had an excellent experience with a single used HP desktop that has been doing computations for the World Community Grid, running at 100% capacity, since late summer 2016. I've dealt with other types of computers, but the ThinkCentres and the Dells have been the ones I've had the most success with, or at least the most personal experience with. (Since initially writing this post, I have developed a suspicion, based on the longevity of the HP cast-off I have, that HP might actually be superior to IBM / Lenovo when it comes to cast-offs; however, since it's the only HP cast-off I can remember ever having, it's hard to form a proper opinion.)

But to wit: Over the past two weeks, I have tried to revive three used computers that were cast-offs.

Two of them were IBM / Lenovo ThinkCentres, which I think were new in 2006 / 2007: 2.66 GHz 64-bit dual cores, 80 GB hard drives, and 2 GB of memory. The third computer was a Dell case with only the motherboard (proving to have been — see below — a 64-bit dual-core CPU running at something like 2.66 GHz) but no memory, no hard drive, no wires, no DVD drive, and not even a power supply!

The two ThinkCentres were from a pile of old computers marked for disposal at a location where I happened to be in mid-2017, and I was granted permission to pick and choose what I wanted from the pile. I gave them to my brother, who at the time evaluated them and determined that neither worked, one just beeping four times and then hanging. After that, they just sat around in his apartment for whenever they might come in handy for spare parts. He had since determined that one actually worked, but he hadn’t done anything with it.

The third computer was found on the street near home a couple of months ago, and was covered with about an inch of snow by the time I’d recovered it. I brought it home, and let it sit around for several weeks just to make sure that it dried out properly. Based on the “Built for Windows XP” and “Vista Ready” stickers, I’d guess that it was new in about 2005 or 2006.

Having forgotten about the ThinkCentre computers I’d given to my brother in 2017, I casually asked him if he had the requisite spare parts to make the snow-covered computer work, since we normally share our piles of spare parts retrieved from old computers that die. To my surprise, he sent me the functional ThinkCentre. My knee-jerk reaction was “I don’t need a new-to-me computer; just the parts required to see if I can get the snow-covered computer to work.” Perversely, I didn’t actually want the results of my planned efforts to produce a functional computer; I just wanted the amusement of a small project, and more generally to see whether the Dell found on the street would work.

In parallel, my home server, on which I hosted my backups and my website (another computer of the used-several-times-over variety), worked perfectly except for mysteriously turning itself off a couple of times recently, perhaps once a week. My brother and I decided that the shutdowns were probably thermal events, no doubt due to a combination of dust accumulation, the CPU fan ports in the case not having enough clearance from the computer next to it to allow for proper aspiration of ambient cooling air, and possibly high heat generation from occasional loads due to search engine bots crawling my website. Despite cleaning out the dust, removing the computer's side panel from which the CPU fan drew air, and shifting both computers a bit to allow for adequate ventilation, the computer turned itself off again after about a week.
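
(Had we wanted numbers to go with the thermal theory, the lm_sensors package gives a quick read of CPU temperatures; a minimal sketch:)

    # one-time detection of the motherboard's sensor chips
    sudo sensors-detect

    # read the current temperatures and fan speeds
    sensors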

My brother and I made a swift decision to replace my server with a new installation on a "new" computer — the good ThinkCentre I initially didn't want — because even though the existing machine was otherwise performing spectacularly well given the overall small load, we tacitly agreed, without actually saying the words, that the shutdowns were a problem for a production server. This incidentally dealt with another curious behaviour exhibited by the existing server, which appeared to be otherwise completely benign, and hence perhaps beyond the scope of why we changed the physical computer.

The operational ThinkCentre was plugged in and formatted with Fedora 31, and my brother helped me install the requisite services and transfer settings to the new server in order to replicate my website. Newer installation practices were implemented, and newer choices of packages were made: for instance, WireGuard for VPN on the new server, while the "old" machine is being kept active for a bit as a backup, as well as to maintain some VPN services — provided by openVPN — while the new server is set up and WireGuard installed, generally allowing for a smooth transition period. Other things we had to remember, as well as learn (perhaps for another time), were to install No-IP as a service, and that drive mounts should be unmounted and re-mounted through rc.local.
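
As a note-to-self on the rc.local trick, a minimal sketch, assuming Fedora's systemd-provided rc.local support and a hypothetical device and mount point:

    # Fedora only runs /etc/rc.d/rc.local at boot if it is executable
    sudo chmod +x /etc/rc.d/rc.local

    # line to add inside /etc/rc.d/rc.local to re-mount the external drive
    mount /dev/sdb1 /mnt/external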

One of the unexpected bonuses to the upgrade is that it appears to be serving web pages and my blog a wee bit faster, for reasons unknown.

In a few weeks, I'll reformat the old webserver and make it another computation node for the World Community Grid; in fact, that was this particular machine's "original" vocation when I first got it in late 2017.

In the meantime, on to the next project: the non-functional ThinkCentre and its spare parts. My first idea was that maybe this second ThinkCentre might still be good, and we looked at a YouTube video that suggested cleaning out the seats for the memory sticks with a can of clean compressed air. I was suspicious of this but let it go for a while, and proceeded to harvest parts from the computer after deciding that the machine wouldn't work regardless.

A power supply, cables, a hard drive, and memory sticks were placed in the Dell found on the street. It powered up, and after changing some settings in the BIOS, I was able to boot up a Fedora 31 LiveUSB. Using the settings option from the Gnome desktop, I was able to determine that there was a 64-bit dual-core CPU running at about 2.66 GHz, that the 2 GB of memory I'd inserted worked, and that the 80 GB hard drive was recognized. I looked around on the hard drive a bit with a file manager (Nautilus) and determined that the place from which I'd retrieved the ThinkCentre appeared to have done at least a basic reformatting of the drive with NTFS. I didn't try to use or install any forensic tools to further determine whether the drive had been properly cleaned, or had merely received a quick reformat.
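
(The Gnome settings panel showed the basics; from a terminal on the LiveUSB, something like the following would report the same information:)

    lscpu       # CPU model, 64-bit support, number of cores
    free -h     # total usable memory
    lsblk       # attached drives and their sizes
    sudo blkid  # filesystem types, e.g. the NTFS reformat on the old drive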

Suppertime came around, and the machine was left idle to wait for my instructions for about an hour or so. When I returned to the computer, I saw an interesting screen:

“Oh no! Something has gone wrong.” error screen

(If you can’t see the picture above, it’s an error screen, vaguely akin to a Windows Blue Screen of Death.) After a few reboots, all with the same “Oh no!” error screen, my brother suggested that the machine may have been thrown out for good reason, intimating that it was good luck that I’d even managed to boot it up in the first place and look around a little bit. I, on the other hand, was relieved: I’d had my evening’s entertainment, I’d gotten what I wanted in the form of working on the machine to determine whether or not the machine could be used, and I’d learned that it indeed couldn’t be used. Parts were stripped back out of the Dell, and the box was relegated to the part of the garage where I store toxic waste and old electronics for the times I have enough collected to make it worthwhile to go to an authorized disposal centre.

At this point, something was still bugging me about the second ThinkCentre. I hadn't yet put my finger on it, but I was suspicious of the "use compressed air to get rid of the dust in the memory bays" solution. So I placed the salvaged parts back into the ThinkCentre — having fun with which wires go where in order to make it work again — and got the four beeps again. I looked up what four beeps at startup means (here's my archive of the table, which I had to recreate since a direct printing of the webpage only printed one of the tables), and found that, at least on a Lenovo ThinkCentre, it means "Clock error; timer on the system board does not work." While I assumed that changing the BIOS battery might well fix the problem, I decided not to investigate any further.

I salvaged the parts again and placed them in my parts pile, ready for the next time I find a junker on the street or from elsewhere. The second ThinkCentre’s case was also placed beside the Dell, awaiting my next trip to an authorized disposal centre.

This means that out of the last three computers, I have one functioning computer replacing an existing computer (that I hope will continue with an industrious afterlife doing something else), one computer scavenged for spare parts and the case relegated to the disposal centre pile, and the Dell computer which was found on the street also relegated to the disposal centre pile.

Or, to paraphrase Meat Loaf, “One out of three ain’t bad …”

Linux Meetup Montreal — Présentation

Cette page sert principalement à exposer le lien vers ma présentation de ce soir au Linux Meetup Montréal, au sujet de l'utilisation de SSH et de SSHfs pour accéder aux fichiers d'autres systèmes depuis votre ordinateur linux (Fedora avec Gnome, dans mon cas).

https://www.malak.ca/linux/20200303présentation.pdf

Essentiellement, je discute du fait que SSH et SSHfs peuvent être utilisés pour les transferts de fichiers, et comment, à la base, les invoquer.

*****

This is a page to expose the link for my presentation this evening at the Linux Meetup Montreal, discussing the use of SSH and SSHfs to access files on other systems from your linux computer (Fedora with Gnome, in my case).

https://www.malak.ca/linux/20200303présentation.pdf

Essentially, in the presentation I say that SSH and SSHfs can be used for file transfers, and in a basic way, how to invoke them.
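
For the non-clickers, a minimal sketch of the sort of invocation the presentation covers, with a hypothetical host and paths:

    # mount a remote home directory locally over SSH
    mkdir -p ~/remote
    sshfs user@example.com:/home/user ~/remote

    # ... browse and edit the files as if they were local ...

    # unmount when done
    fusermount -u ~/remote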

And yes, it’s in French. Deal with it. 🙂

Document Formatting When Joining Texts From Various Sources

I have mounted, on a volunteer basis and in a lay capacity, the annual reports for a community group to which I belong, since about 2008.

Up to that point, the group's annual report was a collection of individual committee reports delivered to the secretary, individually printed out as and when received, and then, when it had to be distributed, stapled together with handwritten page numbers, an added cover page, and an extra page listing the reports and their page numbers. This did have the charm of not requiring the herculean effort and time involved both in mounting the report and, on "printing day", in printing literally a thousand pages or more, depending on the number of pages in the report and the number of copies to be produced. Admittedly, that does not take into account possible collating, depending on how one might print out the reports (i.e. pages with colour drawings and photos vs. black and white, etc.).

The year I took on mounting the annual report, I believed that the annual reports should be in an electronic format such as PDF so that they could be placed on the group's website. But that was barely the beginning of why I took on the job.

Fulfilling the technical goal of making a PDF for download from the website was not too difficult. Two easy options would have been to either scan the report once produced the "old fashioned way" and produce a PDF from all the images, or create individual PDF documents from those received in electronic format (plus scans of those received on paper), then use a PDF joiner to string the PDF files together into a single document. In fact, at the time, I gathered as many previous annual reports as I could and scanned them, making them available on the website.
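
(As an example of the PDF-joiner option, poppler's pdfunite is one such tool; the filenames here are hypothetical:)

    # string the individual PDFs together into a single annual report
    pdfunite cover.pdf committee1.pdf committee2.pdf annual-report.pdf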

However, going forward, I did not consider either option to be satisfactory.

The aesthetic appearance of the annual report irked me. It wasn't the old-school printing on paper — to this day, I still print lots of paper copies for distribution. Rather, I saw an opportunity to address some angst stemming from a bit over a decade earlier, when the community group's recipe book to which I'd contributed had left me with a few ideas on improvements to the text's basic formatting and overall layout. (The actual recipes, variety, organization, editing, and the recipe testing that I learned went on behind the scenes, and the like, were beyond the scope of my interest, although one common error, separate from my angst, was a mild nuisance.) I of course wisely kept my opinions to myself, both at the time of the recipe book in the mid-1990's and at the time of initially volunteering to mount the annual report.

As can be surmised from the above, the reports came from almost as many different people as there were reports, depending on how many committee reports a given individual would take on. Each person would typically type their report on their computer and email it to the office, or perhaps print it out at home and drop it off at the group's office personally. They used whichever word processor they had: sometimes simple text editors, or MS Write, or MS Word, presumably ranging through the likes of Word 97, Word 2003, and Word 2007. Presumably some people had Macs with whichever word processor they might have had. I believe that the secretary, who sometimes typed up the reports that were submitted handwritten, was using a version of WordPerfect. Finally, I was at that point submitting my reports using OpenOffice.org. There may well have been other text editors or word processors used. Each instance presented a random opportunity for default settings to be different, as well as for the user to change the settings to suit their own personal taste.

As a result, each report predictably had formatting unique to its author, and sometimes unique to each individual report when two or more reports were submitted by the same person.

The various differences in formatting in the reports received included the following, without being an exhaustive list:

– varying text fonts and font sizes, and occasionally, more than one of either or both in a given report;
– varying line spacing;
– varying paragraph indentation, including lack thereof;
– blank lines between paragraphs, or the lack thereof;
– varying page margin widths;
– varying text alignment, typically either left justified, or left and right justified;
– the occasional use of italics over the whole document, beyond that which would normally be used;
– the inclusion or lack of section titles, sometimes (or not) rendered bold and/or italicised and/or underlined and/or capitalized;
– tables listing figures in formats unique to each table and report, or simple lists with varying bullet styles;
– varying spelling conventions, i.e. American vs. British vs. Canadian spellings (e.g. neighbor vs. neighbour, or center vs. centre);
– varying naming conventions: Sometimes full names, sometimes initialized first names with full last names, sometimes full first names with initialized last names, or sometimes very informally with only first names;
– varying honourific format conventions: sometimes honourifics, titles, and/or ranks would not be used, with persons simply named, and sometimes they would be referred to with variations of their title such as Reverend, Rev., The Reverend, The Rev., etc.;
– varying naming conventions for committee names, multi-word names, places, and the like, which were sometimes fully spelled out, and sometimes initialized, abbreviated, and / or contracted;
– etc.

As such, and as alluded to in a previous post, minor changes and differences in formatting between the individual reports created subtle (or, depending on the changes, more obvious) visual differences in how each report appeared compared to the others, when joined and printed on paper or read on a computer screen. Multiple permutations and combinations of the above formatting issues often led to wildly varying end results that went beyond the subtle, creating a patchwork of formatting across the multiple reports joined together into a single document. This can be jarring to the eye of some readers, particularly when it is not a subtle, unified, overarching design choice, but rather the result of a decided lack of unified design choice.

This link shows a hypothetical example of how such a report could look (you’ll need a PDF reader) — with various individual reports each having unique blends of formatting as compared to each other. Note that I intentionally use the “Lorem Ipsum” text so as to highlight the formatting.

The obligatory "let's tie it all together" part at the end:

When I collect the individual reports and create one document, I cut and paste all the electronic reports (and, rarely, type up handwritten reports) into a single document, imposing uniform text formatting throughout in the form of a standard font, font size, line spacing, (lack of) paragraph indentation, page margins, and standardized and / or uniform versions of the other items above. Pages are automatically numbered, and standard page headers and footers are automatically added throughout, with date codes to distinguish between earlier and later versions. Basic spelling and typing conventions are applied and made uniform. Note that I don't dictate or edit writing style, so one report might have section headers while another may not; nor do I edit for turns of phrase and the like.

This link shows the above hypothetical report changed (you’ll need a PDF reader) to show the same reports with some basic text formatting across the whole document made uniform, while allowing each author’s text flow (and implicitly, were each text to be unique, writing style as well) to remain relatively untouched.

Have I addressed my angst from the mid 1990’s? Yes.

Is the document formatting on the annual reports I produce every year a work in progress, with subtle improvements, changes, and the like every year? Of course.

A text formatting riddle

I'd like to propose my version of a little visual puzzle I saw years ago. In the following table, the same text is repeated in each cell. In eight of the cells, an element of formatting has been changed from a basic set of formatting, while the ninth contains, in this case, the default settings of the word processor on my system. The riddle is to find which cell has not been modified as compared to the other eight. (View a slightly larger version of the table here.)

A hint of sorts: What the basic formatting settings are, or which word processor I used on which system or OS, all represent red herrings to solving which is “the original”, or “vanilla”, version.

a text formatting puzzle

Scroll down for the solution.
The solution is B2, the cell / square in the centre of the table.

All the other cells have one thing changed from B2's qualities.

A1) The font was changed (from a Serif font to a Sans Serif font);
A2) The font size was changed to a slightly larger point size;
A3) The cell’s background colour was changed to a light grey;
B1) The text was italicized;
B2) Standard, unchanged text using my word processor’s standard settings;
B3) The text colour was lightened from a standard black to a grey;
C1) The text was capitalized;
C2) The text was made bold;
C3) The text’s line spacing was increased.

Besides being, at its core, what I perceive to be a fun riddle, it demonstrates how subtle changes can be made to standard document formatting in a variety of ways. It also alludes to the challenges presented by receiving documents from multiple sources for integration into a single document, such as a community group's newsletter or annual report, presenting content and / or reports from its various members, leaders, subgroups, committees, and the like. In a forthcoming post, I will further discuss basic issues of varying formatting, and the need for standard formatting in a text document, from the perspective of a layman editor of a community group's annual report.

Hotel WiFi Passwords — 2018 edition (aka what a snore fest)

Yet again, I am in a hotel using their wifi. Again, after being asked during check-in if I wanted wifi access, I was curious about how their wifi password would stand up to any kind of security test as they handed me a slip of paper with the information.

Sigh, it is a terribly obvious password that would only barely pass a "security by obscurity" test, by virtue of the fact that, by and large, people don't have wifi-guessing software with standard dictionaries ranging from a normal library dictionary to a hacker dictionary that anyone's 11-year-old could probably compile, certainly with the help of their friends. While there are no doubt dozens, no hundreds, no thousands of "obvious" word combinations that would fit the bill, this one is obviously intended to be very easily remembered by an overwhelming majority of people, be they a typical everyday-anyone-off-the-street person, a tech-savvy person, a forgetful person, children, or "even your mom" (I am trying to delicately refer to my mother, who is both not tech savvy in the least, and very experienced in life, if you take my meaning).

Back in 2015, I was on the subject again, having been impressed at least that the wifi password given to me appeared to be auto-generated at check-in, and obviously not susceptible to simple dictionary attacks.

I started this rant on hotel passwords in 2009 during a series of business trips in which I was at a lot of hotels, and was frustrated for the innkeepers that their wifi would have been so easy to steal for the cost of a night at the hotel and a series of repeaters in the bushes.

Since then, however, I have come to realize that my concerns were a bit overrated. Firstly, the potential of signal theft in that fashion was only really useful for neighbours of the hotels. Secondly, the technical aspects of providing multiple repeaters and power cords down the street (or, as the case may be, through the woods) make the cost, both financial and in terms of maintenance, somewhat impractical beyond a few hundred feet.

This is based on some personal experience of the legitimate variety: Since about 2011, my neighbour at the cottage has had internet provided through, I believe, line-of-sight microwave service; it includes VOIP service to provide telephone service, which apparently is prioritized within the router setup. He kindly gave me the wifi password. After about a year, I installed a wifi repeater so that it could be useful within the house, since there was only about one location within the house within a usable radius of the neighbour’s router (a solid two to three hundred feet away); fortunately, I could plug in the repeater at that location. I have since also been giving him some money annually in appreciation.

What have I found?

The repeater is useful. It provides a constant signal, although it has been susceptible to things like weather, tree foliage, and the like. And, unfortunately, the general service seems to be susceptible to the same, plus things like mountains, and probably the dozens of customers just on my lake and the neighbouring lakes. (Yes, people keep on complaining, and no doubt the suppliers' techies just shift "prioritization" of their services to each successive round of complaining customers, at the expense of the rest of their customers.)

But to wit, the quality of service, at least on the repeater we have, is only barely useful for things like YouTube and the like under the best of conditions; the speed drop from beside the router to our repeater is such that we were able to demonstrate to our neighbour that even if we were consuming such services, we could not be the source of the fluctuating service affecting his internet service (see above). In any case, by and large we respect his request that we not use it to stream video or download large files, since his usage is also metered.

My brother has been wanting to improve our end of the signal for years by setting the repeater near the edge of the property, closer to our neighbour, with things like "waterproof boxes", electrical extensions, and Ethernet cable running through the woods a bit and then hanging in the air above the clothesline. I have been responding bah humbug; it seems far too susceptible to the elements. As a former geocacher, I know that a "waterproof" container left out in the woods is no simple feat, and even were it to remain sealed, it — and the power cable, and the Ethernet cable too — would likely become susceptible to the elements in short order, and not worth the maintenance effort. It seems to be a challenge beyond most commoners such as myself, and even, I suspect, my brother; more along the lines of what the phone company or electric utility faces on a daily basis. Remember how annoying it is when the power goes out or the telephones (landlines or cell network) don't work? Why do they have local teams at the ready 24 hours a day to deal with this? Such outages are regular, due to trees falling, water infiltration, and the like.

Is it really worth going to all this trouble in order to have a series of repeaters going down the street for free wifi? I doubt it would be useful to any real degree except to demonstrate proof of concept to your friends for bragging rights.

So … does it really matter how easy it would be to hack a hotel’s free wifi?

Obviously it matters to the hotel, and to any costs incurred. The reduction in service, and the inconvenience that such signal theft may, in principle, cause to the hotel and its guests? Of course. And any illegal activities in which such illicit users may be engaging (kiddie porn, spam, financial fraud, etc.)? Of course it matters.

But, is anyone beyond the immediate neighbours going to bother with the series of repeaters and power lines through the bushes and/or down the street, possibly spanning several blocks and neighbourhoods?

I have to say “Poppycock!”

PS The "snore fest" in the title was not meant as a pun, but realizing that it unintentionally is one — well, I like dumb jokes and puns, especially the dumb ones. 🙂 So, keeping it is intentional.

Updated recipes

I have been adding my personal recipes to malak.ca since the beginning of December, 2017.

It has been a sort of fresh start at creating my personal cookbook, a project I began, I think, long before 2011 — as early as 2007-ish, as I recall.  (I remember discussing the cookbook with someone somewhere around 2012, and said conversation could not have occurred before 2011.)

Several years ago, I'd put together a personal cookbook, but at a certain point during its construction, the main file either got corrupted, or I had several copies that I didn't manage properly (and presumably, in this scenario, began overwriting previously entered recipes with newer versions of other copies).  However it all happened, I became disillusioned and lost interest, on a practical level, in reconstructing it all, let alone finishing it, despite a certain allure it had.

Back in December, I decided to start from scratch, doing a rather 90's thing — or perhaps even an 80's, or 70's, or 60's thing — I used a basic text editor and started retyping each recipe, sometimes using what I still had as a reference, and in at least four cases just reusing the recipe as I'd entered it a few years ago, from the remnants of the original cookbook file.

In the case of some of the recipes I've been typing in, I've actually been able to tune the text based on recent memory of having made the items within the last couple of weeks (as in, as I was making the item in question, going over to my computer to make adjustments), or up to a couple of months ago.

I started posting PDFs on my website.  And I've been using a "post early, post often" approach to each recipe: check recipes, fine-tune them, repost the update, fine-tune them again, adding sections like "equipment" as I started using them in other recipes, and so on.  I've even been recalling a lightning talk I rather liked at a linux conference I attended in 2011 which, ironically, used baking and recipes to demonstrate to developers the importance of clear, concise, and complete instructions and documentation in order to encourage others to join their software projects.

And, fun fun fun, today I took advantage of another day of holidays, er, waiting for the garage to call me back and say that my car, in for servicing, was ready:  I went through all my recently-typed recipes and did some basic editing.  Lists and sentences / semi-sentences were capitalized.  Lists received dash points.  Instructions which hadn't already been fleshed out were fleshed out.  Sentences with multiple steps were broken up into discrete instruction lists.  A number received sections along the lines of "do this part, then while that cooks, do this part", etc.  (And then came transferring the updates to my webserver, to my laptop as a backup, and to my backup server, which is also my webserver.)
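
(The transferring itself is nothing fancy; a minimal sketch, assuming rsync over SSH, with hypothetical paths and hostnames:)

    # push the updated recipe PDFs to the webserver and to the laptop backup
    rsync -av ~/recipes/ webserver:/var/www/html/recipes/
    rsync -av ~/recipes/ laptop:recipes/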

Obviously, the likes of "cooking sausages" isn't there, even though apparently when I make them for a Santa's Breakfast, they are highly rated, beyond the fact that I'm the only volunteer who actually relishes making 200+ sausages at home in advance.  And having the sausages pre-cooked, so that they only need to be reheated in the oven, is quite convenient when you're serving 100+ people.

Eventually, if you look at the eggplant, first meatball, cheese biscuit and zucchini dish recipes, I may update them in the style of the newly retyped recipes as above, while converting the texts of the newly retyped recipes to that format (the original format for my “personal cookbook”), and take photos.

Finally!  My recipes are now documented, accessible, shared, sharable, and, if I ever get around to it, ready for transfer into a “cookbook”.

New World Community Grid Node

I started volunteering some of my extra computers' idle time to the World Community Grid in December, 2013.  Unfortunately, the machine in question (a used computer I'd bought about five years earlier which, after a few years as a desktop, had been converted to a server under CentOS) died from a "thermal event" nine months later.  It had completed 713 results and earned 419,591 points.

In 2016, I found a P4 3.4GHz machine, installed CentOS 7 on it, and then the BOINC infrastructure.  I assigned it to the World Community Grid and 100% of its capacity to the project.  From when it began in September, 2016 to today, it has completed 4,540 results, and earned 2,568,590 points.

In 2017, I finally converted my old netbook (32-bit Atom processor) to CentOS 6 and did the same thing.  From when it began in April until today, it has completed 261 results, and earned 133,073 points.  (What a difference in capacity 3.4 GHz 64-bit has compared to 1.6 GHz 32-bit!)

Over the past few months, I have been collecting up a number of old machines which have come my way, including some IBM ThinkCentres from the Windows Vista era.  So far, my brother and I haven’t been able to get them running properly, and we will probably end up using them for spare parts.

Meanwhile, we acquired two more computers.  My brother wanted / needed a replacement computer for his aging media server, an old reclaimed IBM ThinkCentre I'd gotten for him a few years ago.  I, for my part, wanted to add another node to the World Community Grid (of course, working at 100% of capacity).

I chose CentOS 7 for this build, as I did for my other nodes, for what I consider to be the obvious reason: I want to pretty much forget about the computers and just relish the numbers on the World Community Grid website — I don't want to be re-installing every year!

The install went well enough, although the base install was a long enough process compared to my laptop and desktop.  I will rule out the comparison to my laptop, since an SSD and a mechanical drive don't compare at all.  As for the desktop versus the node, I'll chalk up the difference mainly to processor speed and general architecture:  a 2015-era four-core i5 running at 3.4 GHz vs. a 2010-era Pentium dual-core E6500 running at 2.93 GHz (no HyperThreading).

What was really long after that was the yum update following the initial install — about 650 packages!  During the updates, I tried a few things like web surfing, and the Gnome desktop became unstable; I ended up with a flashing text screen.  I finally rebooted, and tried to downgrade to an older kernel in GRUB, to no avail.  I tried the rescue kernel, to no avail.  In both situations, I couldn't pull up a terminal with Ctrl-Alt-F2.  A quick check under a Fedora live environment was a waste of time, since I didn't really know how to diagnose things; however, I was able to mount the CentOS drive.

There was some flirting with the idea of installing Fedora 27, but I don't want the re-installation mill on this machine (or any of my other volunteer computing nodes) every year — although seeing my brother's upgrade from Fedora 25 to 27 through the GUI go as smoothly as a routine DNF upgrade is making me wonder if the point is moot.  (Note that CentOS 7, based on Fedora 19, is still using YUM, while Fedora has been using DNF since version 22.)
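
(For reference, the command-line version of that in-place upgrade path on Fedora (not what I did on this CentOS box) is the dnf system-upgrade plugin; a minimal sketch for going to Fedora 27, run as root:)

    # update the current release, then fetch and apply the new one
    dnf upgrade --refresh
    dnf install dnf-plugin-system-upgrade
    dnf system-upgrade download --releasever=27
    dnf system-upgrade reboot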

Finally, I restarted the install of CentOS, this time doing a minimal text install.  Things were a touch faster.  Then I did a yum update, with only about half as many packages to update.  After that, I installed the Gnome desktop on the machine. (Here's my archive.)
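
For my own notes, adding the desktop on top of a minimal CentOS 7 install boils down to something like the following sketch (the archived page has the full procedure):

    # install the desktop environment group and boot into it by default
    sudo yum groupinstall "GNOME Desktop"
    sudo systemctl set-default graphical.target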

I continued with the installation of the Fedora EPEL repository (as root “wget http://dl.fedoraproject.org/pub/epel/epel-release-latest-7.noarch.rpm”, then “rpm -ivh epel-release-latest-7.noarch.rpm”).  Installing the BOINC infrastructure was easy:  As root “yum install boinc*”.

I launched the BOINC manager from one of the pull down menus, and, to my surprise, it actually worked out of the box, unlike previous installations.  Someone must have updated the packages. 🙂  I added the World Community Grid website information, and my account and password.
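
(For reference, the command-line equivalent of that GUI dance would be something along these lines, with a hypothetical account key:)

    # make sure the client daemon runs at boot
    sudo systemctl enable boinc-client
    sudo systemctl start boinc-client

    # look up the account key, then attach the machine to the project
    boinccmd --lookup_account https://www.worldcommunitygrid.org you@example.com 'password'
    boinccmd --project_attach https://www.worldcommunitygrid.org YOUR_ACCOUNT_KEY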

Voilà!  At 12:00 UTC the next morning, my machine had already submitted FIVE results, and earned 2,429 points!  And, at 00:00 UTC as I’m completing this post, a total of EIGHT results, and 4,638 points!

Upgrades to Fedora 27 — what a breeze!

Over the past two weeks, I have upgraded two computers to Fedora 27 (from Fedora 25, having skipped Fedora 26 and enjoyed roughly a year’s worth of Fedora goodness).

The two computers are:

  • Dell desktop (main system):  Intel® Core™ i5-4460 CPU @ 3.20GHz — no Hyperthreading, 1 TB 7200 RPM HD, 8 gigs of memory; the screen was upgraded separately to an Acer widescreen, and the old screen relegated to a "new to me" computer set up as a node on the World Community Grid.
  • Acer laptop (secondary system): Intel® Core™ i7-5500U CPU @ 2.40GHz (Hyperthreaded), now with a 500-gig SSD, 8 gigs of memory.

Two of the equipment upgrades:  the screen on the desktop, which is now a used Acer widescreen, and the laptop's 5400 RPM 1 TB drive, which was upgraded to a 500-gig SSD.  The laptop went from interminably slow to incredibly fast!  The comment from my brother:  "SSDs are one of the few things that actually live up to the hype."  In my experience — under linux, anyway.  Under a corporate-controlled windows box?  Well, I'd say that my work computer, with an SSD, needs the SSD speed just not to be unusable!

The upgrades were incredibly easy this time, and fast, the new SSD installed in the laptop probably being the big factor.  In fact, I was able to do the basic install in about 15 minutes, and the rest of my list (made for Fedora 23, but the basic list is still valid) was easy to complete in the motel room during off hours on a business trip.  One thing took a couple of days to realize:  Fedora has had difficulty with the UEFI on this machine in the past — it would install, and then not work, and I'd have to reinstall under legacy BIOS.  Note that I have a BIOS password, so perhaps in the past I just figured out how to make it persistent.  As for restoring the data, once home, I managed to easily copy all my data files from my desktop overnight.

As for the desktop, having just gone through the process a couple of days earlier with my laptop, I was able to easily update, and then re-transfer my data from the laptop overnight, as well as update my data backup on my home server.

The “big” thing this time?  The hardware upgrades.  The almost un-noticeable thing this time?  The installs, which were incredibly easy, quick, routine, and almost easily forgotten.  Sheesh, I’ve lost track of how many installs I’ve done over the years …


malak.ca updated

Since about late 2016, my website had problems with uptime:  it was mostly down.  In the spring of 2017, it was finally up and I did a bit of restoration work.  And then … it was down again for a few months.  (And, due to the circumstances of this downtime, my restoration work was lost.)

Finally, I transferred my website to an existing home server, and it is now living on a computer which I believe may be as old as 2003, running the CentOS 7.x series, in my bedroom.  Having fixed a faulty telephone line (squirrels!), the line is now "not noisy" and the internet is back properly.

Main work has been: