Tuesday, 5 June 2012

Recruiting Call

This was first published in the FLOSSUK newsletter.

The popular movements that are sweeping the world - Occupy, Anonymous,
the Indignados and the revolutionary movements of the Arab Spring -
are now treading a path that, to someone who has been involved in the
free software movement for more than 20 years, seems rather familiar.

Fighting against the machinations of 'evil corporations' is a fight
those in the free software world know only too well.

One of the slogans of the Occupy movement is "Nobody Wins unless
Everybody Wins" - a realisation that we can never have the peace
needed to concentrate on the job of surviving the end of the age of
cheap energy while there is such gross injustice in the world.

In our world, free software can be freely copied. If I share my code
with you we both still have it, and in fact the act of sharing often
makes the thing shared much better for all the parties doing the
sharing. However, realists would argue you cannot do that with a cup
of rice: sharing my cup of rice with twenty other people is of no help
if everybody starves.

I mention the cup of rice because, as I write this, a recent news
story reports that the rice harvest in India has been such a bumper
one that there is not enough storage space for all of it, and many
tons are simply being left to rot in the open air. Harvested, in
sacks, but being wasted!

Why does it not go to feed the poor - surely better than just feeding
the rats? The pathetic excuse given was that to do so would affect
'the balance of payments'.

I am calling on you, the people who already know how to co-operate
with others to develop software for mutual benefit. Spare some time in
this period of change to see how your skills and experience could help
the people trying to solve the much bigger problem: making the
physical world - the sharing out of those cups of rice - work in the
same "it is smart to share" way.

The machinations of corporations over the past decades have brought
into being much 'property' of the intellectual kind that really has no
business being property.

The ACTA treaty that they recently attempted to foist upon us is a
very clear example of this. Yes, we have the motivation to get angry
about it because of the way it could be used by 'rights' holders to
chill any innovation, or any use of anything similar to their
technology or artistic work, without the payment of extortion.

This bully-boy tactic would also extend to areas such as drugs and,
with the excuse of genetic modification, to foodstuffs, allowing the
multi-nationals an even tighter grip on the life or death of ordinary
people.

What is perhaps even worse is the general stifling of free speech and
free debate brought about by measures such as ACTA, CISPA and our
home-grown variants like the Digital Economy Act - rail-roaded through
in the last week of the Brown government - and by the latest attempts
to monitor and censor our Internet 'for our own good'.

As the Internet is the means by which the human race is finally waking
up to the truth that the world is controlled by a very small number of
very powerful people, is it any wonder they are trying everything they
can think of to neuter its power?

It is too late for that, dissent has reached critical mass.

Here is an example that happened here in London. Occupy held a march
on 11th May, ending with a meeting and teach-in on the small piece of
land in front of the Bank of England and the Royal Exchange. The
police attempted to break up the meeting on the pretence that it was a
nuisance to residents (what residents?). A few dozen protesters
decided to resist peacefully, linking arms to thwart police attempts
to remove them. There were a few arrests, then a stand-off for several
hours, watched avidly by a large audience across the Internet, until
the police just gave up and left. As the City of London is pretty
comatose on a weekend the allegation that the people holding a meeting
were being a 'nuisance' would not have cut any ice with a jury until
Monday morning! Occupiers then left as it was time to sleep and the
point had been made. No matter how much those in power wish it to be
otherwise enough people have now lost their fear. The cost of just
doing nothing and hoping that this is all just an 'economic cycle' is
simply too great.

Yes working towards a new system carries risks, but they are much less
than the risks of just pretending the problems will fix themselves.

This is where YOU come in - I think that people who have had the
experience of making this culture work in the realm of software will
have a very important part to play. Go and find where you will be
useful and volunteer today.


You can read the full manifesto of the global freedom movement, which
has already been printed in several newspapers, here
http://www.globalmay.org/blog/item/95-globalmay-manifesto-template-v3.html
- You really should read it, a great deal of thought and debate has
gone into its production.

And a summary of what is wrong and what is needed to fix it, from a
specifically UK perspective, can be found in the letter that Anonymous
sent to David Cameron -
http://anonflag.com/cameronmessage.html

The audio version takes about 25 minutes.

Beware of the Leopard!


"But look, you found the notice didn't you?"
"Yes," said Arthur, "yes I did. It was on display in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying 'Beware of the Leopard'."

Not quite up to the marketing brilliance of Arthur's local planning office in "The Hitchhiker's Guide to the Galaxy", but yesterday evening, while the Jubilee concert was in full swing, the BBC finally decided to start letting people know we are in a bit of a predicament!

"Surviving Progress" http://www.bbc.co.uk/i/b01jrlsf/ - you only have 7 days though.

For people already awake on the Internet and used to the truth this is all fairly tame stuff, but the fact that it has been broadcast on UK TV is a watershed. Please put pressure on the Beeb to repeat this at prime time on BBC One, with some appropriate publicity for the importance of the message. Forget "EastEnders", the football, even the s***ing Olympics - this is far more important to the viewers.

Show this to people who still have confidence that mainstream media is telling the truth. There are very powerful vested interests that do not like us being told that money is all concentrated with the elite and they are in no mood to give it back!

Get your friends to watch it and then move on to the likes of Crash Course or this:




UK Government Consultation on Open Standards

This is an area in which the UK is sadly lagging behind much of the rest of the world. However, the dire straits that the UK Government's finances are currently in have at last meant that interest in Open Standards, and by implication Open Source/Free Software, amounts to more than just 'making encouraging noises'.

The process seems to have the proprietary software companies rattled, in that they are putting a lot of effort into making their point of view heard. So much so that the consultation process has been delayed (it was due to end on 1st May) because of an undeclared conflict of interest on the part of the person facilitating discussions. The new deadline for consultations was Monday 4th June 2012.

The Free Software Foundation Europe has already issued its view on this. What follows here is my personal take on things.

First, some general principles: why I think the IT that we the taxpayers pay for would be better done in as open a manner as possible.

Modern computing is on the whole really complex. It is true that sometimes, by clever design, that complexity can be made to appear simple, but that does not change the fact that beneath the façade things are still complex. This is something generally true of our modern world.

Well here are the questions and my answers:


Criteria for open standards

1. How does this definition of open standard compare to your view of what makes a standard 'open'?
For a standard to be open, it must be available electronically at no cost to anyone who wishes to obtain a copy (a fee for printed copies is acceptable). There must be no requirement for anyone who produces or uses software implementing the standard to pay royalties.
Being royalty-free is important, and not just as a cost issue - implementing anything meaningful is naturally going to involve costs, many of which will be financial. It is a matter of principle that standards are jointly agreed for the benefit of the parties concerned. There should not be a monetary advantage to just some of those parties at the expense of the others. If a standard is NOT to the benefit of all parties (in areas such as being able to freely exchange information and avoiding lock-in) it should not be graced with the name of standard.
2. What will the Government be inhibited from doing if this definition of open standards is adopted for software interoperability, data and document formats across central government?
What will be inhibited are practices that are to the detriment of the taxpayer who pays for the operation of government. Open standards will encourage work to be done once and then reused and evolved. This means all stakeholders get the best possible return on their investment. As we live on a planet with finite resources, this is a way of working we are going to have to get used to.
3. For businesses attempting to break into the government IT market, would this policy make things easier or more difficult – does it help to level the playing field?
Open standards will make it easier for smaller players to enter the government IT space, in that the work will more likely be done using standard building blocks such as the LAMP stack and the well-tested CPAN archive for the Perl language. Having well-used components means that systems can be built with fewer surprises and mysterious 'cost overruns', which would otherwise deter companies with fewer resources from bidding for work.
4. How would mandating open standards for use in government IT for software interoperability, data and document formats affect your organisation?
Not applicable, as we do not directly interact with government departments.
5. What effect would this policy have on improving value for money in the provision of government services?
Open standards demystify IT and increase the number of people with skills they can take pride in using, rather than people who have just learned some capricious, arbitrary and badly thought-out system by rote. The open-standards-based Raspberry Pi computer is a great example of this - a fully working general-purpose computer at a consumer electronics price that is revolutionising IT education and beyond. Having work done by a motivated local workforce (i.e. people who spend their wages in the community) rather than sending millions back to global corporations by way of licence fees and other intellectual-property-related charges is clearly going to be a wiser way to spend taxpayers' money.
6. Would this policy support innovation, competition and choice in delivery of government services?
Absolutely! Having a rich palette of openly discussed and developed technologies to choose from, and allowing one's technical staff to co-operate freely with like-minded peers, should allow the best solutions to bubble to the surface. There will be no more of government being forced to make do with third-rate solutions just because of contractual obligations. If someone develops a new algorithm that does the same job in half the time, one would hope that a standards body would be flexible enough to let all participants take advantage of genuine innovation.
7. In what way do software copyright licences and standards patent licences interact to support or prevent interoperability?
If software licences are restrictive they could be a very great drag on interoperability - so much so that such software really needs to be considered for replacement. In a system run for the benefit of man we should be aiming for the best interoperability conducive to efficiency and security.
8. How could adopting (Fair) Reasonable and Non Discriminatory ((F)RAND) standards deliver a level playing field for open source and proprietary software solution providers?
FRAND is still licensing, so it should be absolutely rejected. If something is part of a standard it should be shared without any restrictions placed on it, including on its use as a building block for some other standard.
9. Does selecting open standards which are compatible with a free or open source software licence exclude certain suppliers or products?
Yes, if they have a corporate mindset obsessed with placing a monetary value on everything.
10. Does a promise of non-assertion of a patent when used in open source software alleviate concerns relating to patents and royalty charging?
No, as a promise can be withdrawn or broken.
11. Should a different rationale be applied when purchasing off-the-shelf software solutions than is applied when purchasing bespoke solutions?
If open standards were more widely adopted, the powerful building-block culture would mean that many bespoke systems are not as bespoke as once thought. These are just two ends of the same continuum.
12. In terms of standards for software interoperability, data and document formats, is there a need for the Government to engage with or provide funding for specific committees/bodies?
It is absolutely in the interests of the taxpayer that the advantages outlined above come to Government IT spending. There would be some very useful scope for funding oversight to keep this process on track, particularly if done in close co-operation with bodies with the same aims in other countries.
Compared to the money currently flowing to corporations for intellectual property, the amounts involved would be very small. For over 20 years now the whole Free Software/Open Source movement has been surviving very well on a large number of small sources of funding rather than a very centralised grant-awarding infrastructure.
Trying to build such a thing on a grand scale would be a mistake. This needs light-touch guidance only.
13. Are there any other policy options which would meet the described outcomes more effectively?
None that I know of.

Open standards mandation

1. What criteria should the Government consider when deciding whether it is appropriate to mandate particular standards?
If having a standard here would help avoid costly 'reinventing the wheel' by system specifiers.
If interoperability between the proposed system and others is an important part of its design.
If using standards would make best use of the skill bases of the staff available to do the work (e.g. is the benefit of writing the software in a new programming language worth the need to retrain or hire specifically skilled staff).
2. What effect would mandating particular open standards have on improving value for money in the provision of government services?
If standards mean that products for which licence fees are payable can be avoided, there would be a great improvement in value for money, in that money would be paid for work done and services delivered in support etc., rather than for the nebulous idea of 'intellectual property'.
3. Are there any legal or procurement barriers to mandating specific open standards in the UK Government's IT?
(Left blank as we are not lawyers)
4. Could mandation of competing open standards for the same function deliver interoperable software and information at reduced cost?
Yes, as ways of doing things with IT have overlapping capability spaces. A good example of this would be choosing to implement a web site in Java or Perl: Perl would have an advantage for things that need to change quickly, Java for big-system engineering.
5. Could mandation of open standards promote anti-competitive behaviour in public procurement?
No, it would help prevent the anti-competitive behaviour associated with certain players holding key strategic pieces of Intellectual Property.
6. How would mandation of specific open standards for government IT software interoperability, data and document formats affect your organisation/business?
It lowers the entry cost of information interchange with the Government, e.g. not having to buy a copy of Microsoft Office just because some government department will only accept docx files.
7. How should the Government best deal with the issue of change relating to legacy systems or incompatible updates to existing open standards?
A legacy system is by definition a stable API, so it should be possible to write a set of interfaces to it to make it appear to the outside world as though it is part of the new system (although perhaps with much lower performance). It may also be worthwhile to simulate the old system using rapid application development methodology, to afford a seamless transition into something that can fully interchange data.
8. What should trigger the review of an open standard that has already been mandated?
If adhering to the standard becomes a significant drag relative to the currently accepted state of the art. Government need not be a trail-blazer, but it is poor value for the taxpayer if it lags behind, using now-outdated methods to do things.
9. How should the Government strike a balance between nurturing innovation and conforming to standards?
Government should aim to stay in the middle of mainstream use: not rushing out in front, apart from funding research, nor trailing behind due to standards that are too slow to be updated. Staying with the crowd maximises the interoperability benefits this is all about.
10. How should the Government confirm that a solution claiming conformity to a standard is interoperable in practice?
Extensive suites of automated regression tests for all software produced.
11. Are there any other policy options which would meet the objective more effectively?

International alignment

1. Is the proposed UK policy compatible with European policies, directives and regulations (existing or planned) such as the European Interoperability Framework version 2.0 and the reform proposal for European Standardisation?
(Left blank as we are not lawyers)
2. Will the open standards policy be beneficial or detrimental for innovation and competition in the UK and Europe?
Beneficial as it gives control back to individual technologists, not the legal departments of large corporations.
3. Are there any other policy options which would meet the objectives described in this consultation paper more effectively?



Wednesday, 25 April 2012

Got a day to spare? Here is some interesting viewing

I have been familiar with the Crash Course by Chris Martenson for a few months: http://www.chrismartenson.com/crashcourse 



and recently also watched the Thrive movie http://youtu.be/lEV5AFFcZ-s



And just last night the second Zeitgeist movie http://youtu.be/1gKX9TWRyfs





I would recommend that everyone watches all three of these (yes, it will take some time). They come from totally different angles, meet in the middle in that they identify the same root cause of the problems (money as debt etc.), but then go off in somewhat different directions to propose solutions.

Crash Course is all science, facts and realism: we have burned through nearly all of our resources now. It is a big commitment in time, and if you want to go even deeper there is a very good book of the same name available too, with a huge set of references back to all the source material. Chris is a scientist, so it is all very rigorous.

The Zeitgeist view is that if we only learn to be nice to each other we have sufficient resources left to properly exploit solar, geothermal and so on, and so really have less need for the things that are in increasing scarcity. The remaining oil and gas need to be treated as a raw material, not something to be burnt! Responsible stewardship of the finite planet is of course vital.

The thing that I think is going to be a brake on full-scale adoption of the Zeitgeist ideas by the general population is the anti-religion stance. Getting people to radically change the way we live without the carrot and stick of religion may be difficult.

Thrive, in contrast, is very spiritual. It is closely linked to the whole UFO culture and hints that there are new forms of energy (not just wind, solar etc.) whose development has been suppressed by vested interests. This may be true, but it would require the operation of a huge conspiracy to keep it going.

All three of these have very good points to bring to the discussion of where the human race goes next. Staying as we are is just not an option; the current system is a sinking ship. This needs to be drummed into the head of every corrupt politician.


Something that can be done quite easily in the short term is the push to roll back the creeping boundaries of Intellectual Property - a barefaced attempt by the few to create artificial scarcity: a scarcity of ideas and knowledge. Please spread the word about the next pan-European protest against ACTA on 9th June - http://www.facebook.com/events/183572751753787/

Friday, 20 April 2012

Finding a use for my Dreamplug

About a year ago I bought a natty little ARM-based system, the Dreamplug, from NewIT. You can still get them, in fact, so this is not a story about crusty old technology at all.

A contract and other projects came along and the Dreamplug sadly gathered dust on the shelf for many months.

What reminded me of my purchase was all the mania about the Raspberry Pi, which is a very similar, and even cheaper, computer to the Dreamplug. Several of my friends in the Chelmer Linux & Android Users Group have the Raspberry Pi on order, so I felt it would be useful for me to get familiar with ARM-based Linux again.

As I am resting between jobs at the moment I thought I may as well pass the time trying to do something specific and useful with the Dreamplug.

I am on Virgin Internet here and, although I am pleased with the service most of the time, the router I was provided with has an annoying habit of locking up and needing to be reset every week or so. I got them to replace it and the new one has the same vice, so maybe the problem lies deeper? As the router is not something I am allowed to poke around inside, my options beyond 'turning it off and on again' were a bit limited.

The Dreamplug has two Gigabit Ethernet ports and, with the Ubuntu OS that comes with it, also supports operation as a wireless AP, with Bluetooth too. As my Internet is only a 30Mbit service it seemed wasteful to use one of the Gigabit ports for that, so I have decided to use a USB-to-Ethernet adaptor that I happened to have lying about - a MosChip MCS7830 - for the external connection. This frees the two Gigabit ports up for connecting the living-room PC (as that is where the TV company's cable modem has to be situated) and the wiring going to the office.
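
To make the plan concrete, here is a minimal sketch of how that wiring might be brought up by hand once the USB adaptor is recognised. The interface names (eth0 and eth1 for the Gigabit ports, eth2 for the USB adaptor) and the 192.168.1.x addressing are my assumptions, and brctl comes from the bridge-utils package:

# sketch only - interface names and addresses assumed
brctl addbr br0                      # bridge the two Gigabit ports for the LAN side
brctl addif br0 eth0
brctl addif br0 eth1
ip addr add 192.168.1.1/24 dev br0   # assumed LAN address for the Dreamplug itself
ip link set br0 up
dhclient eth2                        # the USB adaptor facing the cable modem gets its address by DHCP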

I spent a few years in the past working for Smoothwall so I am after something that is similar to their Smoothwall Express product, which as yet is only available for Intel technology PCs, not ARM.

What Smoothwall Express provides is a firewall that runs on an old PC, with the following extra features:
  • Simple IPSEC VPN
  • Traffic shaping
  • DHCP server for the local network
  • Web proxy
  • A lot of other stuff
  • All with a web UI.
I ran a real Smoothwall in the past on an old Celeron-based PC. However, with electricity costs going up every year, it would be cool for something that uses a tiny fraction of that power to do the same job.

Going the whole way to an ARM-based Smoothwall would be a big undertaking, but I can at least do some of the preparatory work to allow such a thing to be made on a Dreamplug or, perhaps even more interestingly, on a Raspberry Pi.

The Raspberry Pi is very cheap, but use as a firewall would require two Ethernet interfaces, so it would need a USB-to-Ethernet adaptor. The onboard Ethernet is apparently already provided via the USB bus, so adding a second one would not be that hard. I do not yet have any Pi hardware, so I am going to concentrate on getting this working on the Dreamplug first.

Please treat these blog posts as a work in progress, not as a finished 'howto'; a lot of it is just me thinking out loud about what is needed to get from point A to point B.

Day 1

As the Ubuntu that comes on the Dreamplug is rather old (a customised version of 9.10 Jaunty Jackalope), my first step was to take advantage of the Debian Squeeze SD card images that are available for download from the NewIT website. Although I had an 8GB SD card, I found that when uncompressed the 8GB SD card image was too big. I followed the advice of others and used the 4GB image instead. This seemed to work just fine and I created another partition for /home on the rest of the card.

At the moment you cannot just boot ARM systems from a CD and build from there - the collection of assumed hardware that makes up a PC simply cannot be assumed. Linux cannot be 'installed' as such on the target system; it has to 'pre-exist'. The way to do this is to take a compressed image and, using another, bigger Linux system, extract it onto the SD card. The end result is a ready-to-go SD card with the partition table in place and everything in a default primordial state.
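
For completeness, the extraction step on the bigger Linux machine looks roughly like this. The image filename and the /dev/sdX device name are placeholders - check dmesg for the device your card reader actually appears as, because writing to the wrong device will destroy its contents:

# names here are examples only - substitute your own image file and device
gunzip -c debian-squeeze-dreamplug-4gb.img.gz | dd of=/dev/sdX bs=4M
sync
fdisk /dev/sdX              # add an extra partition in the remaining space for /home
mkfs.ext3 /dev/sdX3         # assuming the new partition ends up as the third one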

The root password for the standard NewIT Debian images is nosoup4u, so the first thing to do is change it to something else! There is no security problem with me relaying the default password to you, because at this stage the standard image does not even have sshd installed (so you cannot log into it over the network anyway). After changing the root password you can update what will by now be rather old packages, and then install sshd.

The basic command line tools will have to be used to do this:

apt-get update - to update the repository data.
apt-cache search sshd - to tell us what the packages related to ssh would be called.

apt-get install openssh-server - to install it.

Reboot the system and note the DHCP-granted IP address as your Dreamplug comes back up, and you will be able to start connecting over the network rather than via the JTAG serial console.
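
If the address scrolls past too quickly, there are a couple of ways to find it again (the interface name and subnet here are assumptions for my network - adjust for yours):

ip addr show eth0           # on the serial console: show the address eth0 was given
nmap -sP 192.168.1.0/24     # or, from another machine, ping-scan the LAN for the new host (-sn in newer nmap versions)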

The next thing I did was apt-get install synaptic - to get the more familiar GUI tool for package selection.

This brings in a lot of other dependencies, so in one operation it brings the very skeletal initial disk image up to a much more complete spec. I also added Samba and the Tomcat web server at this stage (something I want to learn about, and an ARM system should be fine running Java - just look at all those Android phones!). Disk use is still only under 1GB, so there is plenty of free space on the 8GB SD card.
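
For the record, the commands were along these lines - tomcat6 is my assumption for the Squeeze-era package name, so check with apt-cache search tomcat first:

apt-get install synaptic samba tomcat6
df -h /                     # confirm how much of the SD card is actually in use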

One big disadvantage of SD cards is that their write speed is still quite slow. Installing a large number of packages can take a LOT longer than on a bigger, disk-based system. Be patient and go and get a drink or two! The Dreamplug has an eSATA port though. Once I have the basics working I have a 2TB eSATA disk to add as some NAS storage. However, making sure the basic system all works in the confines of an SD card is good discipline, even if it can sometimes be slow.

After installing this first tranche of packages I did a reboot to confirm that the system still comes up cleanly. A good tip is that, at regular checkpoints, e.g. after a day of working on it, you should take a compressed disk image of the SD card from your ARM system using a bigger machine. Writing SD cards takes a while, but reading one to back it up should be nice and quick, and restoring an image of where you had got to yesterday is a lot better than retracing all the steps!
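
A minimal sketch of such a checkpoint, done on the bigger machine with the card in a reader (again /dev/sdX and the filenames are placeholders - get the device wrong and you will image the wrong disk):

dd if=/dev/sdX bs=4M | gzip > dreamplug-$(date +%Y%m%d).img.gz    # take the checkpoint image
gunzip -c dreamplug-20120420.img.gz | dd of=/dev/sdX bs=4M        # restore it later if needed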

The set of goals for tomorrow (or whenever I get the time to play with this) is as follows:

  1. The USB-to-Ethernet adapter I had is seen by lsusb but NOT automatically recognised as an Ethernet port - I need to find out how to bridge this gap (see the sketch after this list).
  2. The WiFi that works under the Jaunty Jackalope image does not seem to be detected by default in the Debian image. This needs to be sorted if I want to use my wireless devices on the Internet anymore.
  3. Get a Smoothwall build environment set up and look into turning it into a cross compiler for ARM - so that the full suite of tools can be built for ARM not Intel. This will probably take much more than a day!
  4. Learn a lot about Debian as it exists without all the flashy Ubuntu GUI!
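
As a starting point for goal 1, my working theory is that the adapter just needs its driver module loading by hand. I believe the module for the MosChip MCS7830 is called mcs7830, but treat that as an assumption to verify:

lsusb                       # confirm the adapter is seen on the USB bus
modprobe mcs7830            # load the driver I expect to cover the MCS7830 (assumption)
dmesg | tail                # see whether a new network device was registered
ip link                     # the adapter should now appear as an extra interface (e.g. eth2 or usb0)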





Monday, 26 March 2012

Oh for one kernel that does everything.

I am generally very happy with CentOS 6.2 as an OS for the little Lenovo Ideapad. Everything seems to work; I even have a virtual machine running FC16 on top of it.

A net-book that is powerful enough to run virtual machines? Yep!

I also tried to make a virtual machine to run the Windows 7 that was originally the native OS. I sized everything down a bit but, in particular, made a 'Compaq diagnostics' type partition to hold the recovery copy of Windows 7 that I had previously used with great success to restore Windows to the machine after the first foray with Fedora Core 16. In the case of the 'bare metal' restore there was the magic reset button on the machine to press; I guessed that making the special partition bootable in the virtual machine would have the same effect. I tried, but just got a frozen screen without even a hint that anything was going on.

The licence that came with Windows 7 indicated that running it under a virtual machine was permitted, but no practical path was apparent for removing its role as the primary OS. Oh well, it looks like the only thing I will ever be able to do with the copy of Windows I have paid for is to restore it to the PC when I finally come to sell it. I would hope that by that time most people will have come to realise how much better a PC is when running Linux, so the need to restore a by-then out-of-date Windows OS should not even arise.

The only gripe I have with CentOS 6.2 is that it steadfastly refuses to work with my 3G modem stick. I have an MF112 stick from Three that works flawlessly with every version of Mint and Ubuntu I have tried, and even an old Fedora 14. The modem is seen but the connection collapses with a -110 error. Very annoying.

As a little experiment I put a Fedora Core 16 kernel onto the system as an extra boot option. It works (mostly), in that the 3G modem works (which proves that all the NetworkManager/ModemManager userland side is good in CentOS) but the dratted WiFi does not. There are also some issues with sleep mode not working, and with SELinux wanting to scan the whole filesystem on my return to CentOS, just in case the alien kernel had been up to anything naughty! These issues may be resolvable by going further down the 'Frankenstein' route than just having alien kernel images and modules, but I did not really want to end up with an unmaintainable mess!
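
For anyone curious, the experiment was roughly along these lines - a sketch only, with the package name and version invented for illustration; expect dependency grumbles, and make sure the CentOS kernel stays the default boot entry:

rpm -ivh --nodeps kernel-3.3.4-3.fc16.x86_64.rpm    # kernel packages install alongside the CentOS one, they do not replace it
grep title /boot/grub/grub.conf                     # check that a new boot entry appeared and the CentOS kernel is still the default
# to back out later: rpm -e kernel-3.3.4-3.fc16.x86_64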

In spite of the rough edges, being able to take a kernel from a newer distribution and demonstrate that particular behaviours are down to the kernel version alone - 3G working, WiFi not - means that the bugs can be reported to their respective kernel maintainers.

In both directions the bugs are annoying. As I mentioned, my 3G modem works just fine in Fedora 14, which is a couple of years old now. Surely the support code for it should have made it into CentOS 6.2?

The particular WiFi hardware that the Ideapad uses not working with the latest Fedora and Ubuntu releases is even more worrying, as it represents a regression failure - that is to say, a feature that was working suddenly breaks just because a new release of the software has come out. Regression errors make us look bad. Unless the hardware concerned is truly archaic there really is no excuse for hardware to work in one kernel release and break in the next.

Wednesday, 14 March 2012

More on the Ideapad

Ok, the saga of the Lenovo Ideapad, part 2.

I have had this rather nifty netbook for nearly a month now. I know that because the AV software is starting to bitch about the free trial running out.

Stopping the bitching is going to cost me £40 a year! That is a large proportion of what this little computer cost, and for fixing things that should not have been a problem in the first place. I will pass on that, I think. It means I will never be able to connect Windows to the Internet again, but that is a small loss. There is nothing I do on-line that Linux cannot already do for me.

You may recall that after my first experience of putting Linux on the system I admitted defeat and restored Windows? This was just because I could not get Linux to see the WiFi: an annoying bug in which the driver seemed to be loaded but was adamant that the interface was disabled.

Partly because I wanted to see if the WiFi was still working, I reinstalled Windows by putting the special hidden service partition back on and pressing the special 'recovery' reset switch the Ideapad is equipped with. This rebuilt the rest of the partitions and returned the computer to the same state it first arrived in - a handy trick if I ever want to sell the machine on to someone who cannot be convinced that Linux is a better choice.

At first the WiFi refused to work for Windows too! Then I discovered that there are two distinct ways to disable the WiFi: a slider switch, which I did know about, and an Fn key combo, which I did not. Once I had used this keyboard combination the WiFi started working in Windows, much to my relief.

This left me in the position where Windows had once again taken over the whole disk, with no room for Linux. With 300GB of space this is just crazy. My previous netbook worked just fine with a 16GB internal disk and a 16GB SD card, so 300GB is a big step up and plenty of room to keep a small Windows partition for the occasional 'only works on Windows' job. I found that Windows now comes with a tool for partition editing, so I used this to shrink the big Windows partition down to just 40GB.

This is where the weirdness started. The old DOS-compatible partition table was laid out with a small Windows boot partition as sda1, then the big Windows partition as sda2, then an extended area as sda3, then the special hidden partition as sda4. As the area of the disk for extended partitions was only the area covered by sda3, it was not clear to me how the newly freed space would be treated. Windows seems happy that it now has an extra partition, but using the tool has left me in a position where Linux no longer accepts that the disk has a valid MSDOS (i.e. primary) partition table!

Booting Linux from a USB key and using the gparted tool gives more information. Some of the partitions are seen, but backup copies of the table are not in the places they should be, and so on. Windows itself seems to work normally, but using the provided repartitioning tool seems to leave the disk in a mess when it comes to introducing new resident OSes. I wonder if this is deliberate or just incompetence? Just a warning about using Windows' own tools to do resizing: you could end up with something of no use to anything but Windows!
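
Before attempting any more surgery it is worth capturing what is actually on the disk from the Linux side. A sketch, assuming the internal disk shows up as /dev/sda:

fdisk -l /dev/sda                              # list what the kernel thinks the layout is
sfdisk -d /dev/sda > sda-table.txt             # dump the MSDOS partition table to a text file as a backup
dd if=/dev/sda of=sda-mbr.bin bs=512 count=1   # keep a copy of the MBR sector itself, just in case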

As the main partition table is no longer intact, I am not even sure that the 'recover to factory settings' option will work without re-creating the original partition table by hand.

The WiFi support in Fedora Core 16 is still a bit of a mystery. When booting from the DVD the wireless is recognised and in use for fetching updates during the install. Changing to a console and using the lsmod command confirms the particular driver modules in use:

ath9k
mac80211
ath9k_common
ath9k_hw
cfg80211
rfkill

The odd thing is that when the installed OS is booted the WiFi stack no longer works - we are back to the 'WiFi is disabled' gripe.
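
Given that rfkill is in that module list, my suspicion (unconfirmed) is a soft or hard block being applied on the installed system but not on the live DVD. Something like this should show whether that is the case:

rfkill list                 # any 'Soft blocked: yes' or 'Hard blocked: yes' lines point at the culprit
rfkill unblock wifi         # clears a soft block; a hard block means the slider switch or the Fn combo
dmesg | grep -i ath9k       # see whether the ath9k driver itself is complaining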

I then tried a different tack. What I really need is something as close as possible to RHEL that I can carry around with me, so I tried the CentOS 6.2 live DVD - it saw the WiFi first time.

By this time my month's free trial of AV had just about run out, so it was time to say goodbye to Windows forever and do a complete re-install with the whole disk given over to CentOS. With 4GB of memory as standard it will be possible to run virtual machines to get Windows running later, I hope. CentOS also has the advantage that it stays with the GNOME 2 user interface for now. GNOME 3 may look very pretty for this tablet age but I prefer a bit of simplicity.

CentOS has a big advantage in that it is binary-compatible with Red Hat Enterprise Linux. All the packages are built from the same open-source code that Red Hat uses. This means that, although you will be a little behind the 'bleeding edge' of what is new and funky in Linux, you will at least have skills that transfer directly into the workplace.

Will I miss Windows? Not at all. It turned out that the ONLY thing I was using on a regular basis was the Google Chrome browser, which is also available on Linux, and looks and works virtually the same. Why put up with all the downsides of Windows just to provide a platform underlying a browser?