Reading a technology bulletin I found a piece of news well worth sharing. I've read that DARPA aims to build supercomputers that can execute 50 GigaFLOPS per watt of power consumed. Any progress in this field is interesting, but even more so if they hit that efficiency target because, for me, it means research into more power-efficient processors and a consequent improvement in day-to-day power consumption, especially longer battery life for laptops.
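To put that target in perspective, here is an illustrative back-of-the-envelope calculation (my own numbers, not DARPA's): at 50 GFLOPS per watt, a one-petaflop machine would draw about 20 kW.

```python
# Back-of-the-envelope: power draw of a machine at a given efficiency.
# The petaflop figure is just an illustrative workload, not from the bulletin.
TARGET_EFFICIENCY = 50e9   # FLOPS per watt (DARPA's stated target)
PETAFLOP = 1e15            # one petaflop, in FLOPS

watts = PETAFLOP / TARGET_EFFICIENCY
print(watts)  # 20000.0 watts, i.e. 20 kW
```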
I'm very happy to announce that on April 1st I started working under a research fellowship in group E096-02: Intelligent Systems and Telematics, of the Department of Communications and Information Engineering at the University of Murcia.
My research lines are: cluster management systems, network management and configuration, and telematic services.
Here is my institutional web page.
GT Solutions is a company from Malaga (Spain) that sponsored the Open Source World Conference Malaga 2008 and presented EuroGes, a business administration solution, which they claim is under the GPL. That claim is fake.
On their downloads page there are only binary packages and, when someone asked them for the sources, they wanted to be paid to provide them. As far as I understand the license, a GPLed product can be sold, at least in binary form, but the sources must then be made available at no more than the cost of distributing them. If they let people get the GPLed binaries for free (as in free beer), the sources must be freely available too.
The government of my country has decided that Computer Engineers are useless because computing is a cross-cutting discipline that all the other degree programs should absorb, so it has removed Computer Engineering from the official degrees of Spain.
This decision is going to have big consequences for the professionals who currently hold the degree, because they will never be recognized as "computing professionals" and anyone can be hired to do that work. Frustration also comes to mind when I imagine a future without a common, centralized Computer Science degree... Spain is going to become a brain-importing country.
Mono 2.0 has been released. It is a major release with a lot of bug fixes and new features. The main changes are the compiler upgrade to C# 3.0 with LINQ and the inclusion of the 2.0 versions of ADO.NET, ASP.NET and System.Windows.Forms, which is more than enough for a 2.0 release. Please go to the Mono site to read more.
With this release, I hope Mono can attract many new contributors, as well as companies that depend on .NET and want to migrate their systems to Linux.
I encourage every .NET developer to try Mono 2.0; it is today's heart of multiplatform and multilanguage development. Moreover, this release should be included in every Linux distribution. I hope it is in time to be included in Ubuntu 8.10 (Intrepid Ibex) and soon backported to distributions that have already been released (thanks, Meebey, for always getting Mono into Ubuntu & Debian so soon after each release).
Death Magnetic, the latest album from Metallica, is out, and my girlfriend has bought a copy for me (as a gift).
I was excited to hear this new record, and I must admit it is like a breath of fresh air. The songs are much more polished than those on St. Anger and recall the band's whole career, with simple but strong compositions and catchy rhythms. It is very pleasant to hear guitar solos again, and the instrumental track deserves special attention (nearly twenty years have passed since the last one).
Metallica is going in a very interesting direction. Nowadays, an album like Death Magnetic is impressive.
After GUADEC, I've seen a few reactions about GTK+'s future (GTK+ 3.0), and after reading a few posts I decided to write about it. Here is my two cents on the topic.
I've seen two main points of view about what GTK+'s next major release should be. On one hand we have the community proposal: breaking ABI and API to get a better base for adding new features. On the other hand we have the ISV-focused proposal: ask ISVs and collaborate with them to determine the most useful changes.
The first point I see in this matter is that we can always maintain the "2.x" branch in close collaboration with ISVs and, at the same time, have a 3.0 branch with a lot of improvements, new features and ABI/API changes. Yeah, I know, this is hard and discouraged because it spreads human resources thin, but it opens a new field for new communities and companies. Why don't certain ISVs spend their money maintaining FOSS that is strategic for them?
Another point is that "3.x" should be very different from "2.x" and provide a heap of strategic new features, to build a powerful base library that can be used in the long term. I agree with Miguel de Icaza on this point.
Summarizing, my opinion is that it shouldn't be bad to break ABI/API in "3.x", because doing so creates new needs (for example, maintaining the "2.x" branch) that new communities or companies can fill.
In the last few days, just before GUADEC, a much-replied-to post about GNOME decadence appeared. Trying to come up with a guide to a project's life cycle, I thought that the years in which its code was mainly written could be a good indicator of its status.
I've written a quick and dirty script that counts the lines of code written per year, and I've computed that figure for the XSP project from the Mono repository.
From the output for XSP we can see that the software grew very fast from 2002 to 2005, with 2003 having the highest number of changed lines in that period. It seems that in 2006 the software was mature enough to fall out of focus, and the changes decreased a lot. Later, in 2007, there were a lot of changes again, which is a strange situation because the number of changed lines that year was close to the sum of all the other years combined... If the number of changed lines had kept growing in 2008 we could talk about a "rebirth" of the project, but it hasn't: the count looks like it will end up around the same value as in 2006.
This analysis suggests that XSP is either mature enough that it no longer changes much, or in "decadence" (without new features). Well, we should keep in mind that XSP is the implementation of a web server interface for System.Web, so there aren't many areas left to improve or innovate in.
It would be great to see numbers of this kind for other software projects. I'll collect some more in a couple of days.
Here is the script I've used:
# Blame every file in the repository; skip directory entries (trailing slash).
for file in $(svn list -R $URL | grep -v '/$'); do
    svn blame -v $URL/$file
done > $LISTINGFILE
# Keep the date column (field 3) of the blame output and extract the year.
cat $LISTINGFILE | sed "s/^ *//" | sed "s/  */ /g" | cut -f 3 -d " " | cut -f 1 -d "-" | sort > $TEMPFILE
# Count the changed lines per year.
for i in $(uniq $TEMPFILE); do echo -n "$i: "; grep -c "^$i" $TEMPFILE; done
I know that "LISTINGFILE" is not strictly needed, but I've included it because the "for" loop running "svn blame" is very slow as it walks the entire repository, so it's good to have a "cache". Maybe TEMPFILE could be removed too but, as I said, it's a quick and dirty script.
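For what it's worth, the same per-year count can be sketched in Python by parsing the cached blame listing. This is my own sketch, assuming the listing holds `svn blame -v` output with the date in the third whitespace-separated column:

```python
from collections import Counter

def lines_per_year(listing):
    """Count changed lines per year from cached `svn blame -v` output.

    Assumes each line looks like: "  rev  author  YYYY-MM-DD ...  code".
    """
    years = Counter()
    for line in listing.splitlines():
        fields = line.split()
        # The third field should start with a four-digit year.
        if len(fields) >= 3 and fields[2][:4].isdigit():
            years[fields[2][:4]] += 1
    return dict(years)

# Tiny fabricated sample of blame output, just to show the expected shape.
sample = """\
 1001 alice 2003-04-02 10:00:00 +0000 (Wed, 02 Apr 2003) int main() {
 1002 bob   2003-05-10 11:00:00 +0000 (Sat, 10 May 2003)     return 0;
 2001 alice 2007-01-15 09:00:00 +0000 (Mon, 15 Jan 2007) }
"""
print(lines_per_year(sample))  # {'2003': 2, '2007': 1}
```

`Counter` replaces the sort/uniq/grep dance in the shell version, so no temporary files are needed.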
Yesterday I saw in the news that the New York Times has published an article about my region (Murcia) that exposes our need for water.
I need to clarify that the problem is not so big, and that there is a political campaign to convince society that it is, so that people will fight for water. Environmentalists talk about water savings, claiming it is good for nature, but we need to know that this is false. The water cycle is very quick, and "wasted" water finally reaches the sea... the "beginning" of the cycle. In the background, and maybe without knowing it, environmentalists are only supporting the golf course business, the main consumer of water in my region.
Tomorrow (May 25) is Towel Day, and to participate in the event I wrote a silly Python script:
python -c "l=sum(ord(c)-ord('a') for c in 'linux'); print(l+l+l//l+l)"
My idea was to get this result via recursion from any word passed to a function... it turned out to be impractical.
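For the curious, a hypothetical recursive version (my own sketch, not the original idea) could look like this:

```python
def word_value(word):
    """Recursively sum the alphabet positions (a=0, b=1, ...) of a word."""
    if not word:
        return 0  # base case: empty word contributes nothing
    return ord(word[0]) - ord('a') + word_value(word[1:])

print(word_value('linux'))  # 11 + 8 + 13 + 20 + 23 = 75
```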
And don't forget that Forty Two is the Answer to Life, the Universe and Everything.