Thanks to Alex Graveley for linking to a very interesting new research result from Ed Felten and others, explaining that encryption keys can be easily retrieved from the memory of a running system by power-cycling it. Contrary to what most people think, it is possible to retrieve almost all data from a DRAM chip several seconds or minutes after a power cut. Many companies (including the one I work for) require hard disk encryption for all laptop computers in order to ensure that any sensitive information stored on the machine cannot be retrieved even if the machine is stolen.
However, the report published by the Princeton researchers shows that if the machine is running or is in suspended mode, then an attacker can steal it and get both the encrypted hard disk and the decryption key. This key must be stored in the RAM of the running system so that it can access the files on disk. The attack consists of briefly removing the power from the machine and rebooting it using a small program that saves the contents of the memory to some external storage. Once this is done, the hard disk encryption key can be retrieved from the saved data. Some machines have a mechanism that clears their memory after a reboot (this is often the case with ECC memory). But even then, it is still possible to retrieve the decryption key by cooling down the memory chips, removing them from the machine and inserting them into another machine that will extract the valuable information.
This is a serious problem for anybody who relies on hard disk encryption for protecting confidential data: an attacker who has physical access to the machine (even for just a brief moment) may be able to retrieve the decryption key and get full access to the contents of the disk. Leaving the machine unattended in suspended mode or with the screen locked may be the same as leaving it fully open.
There are not many ways to avoid this problem, besides preventing physical access to the machine or using some software or hardware self-destruction mechanism in case the machine is tampered with. If the machine is suspended, the research paper (PDF) explains that it may be possible to clear or obscure the key before suspending the system so that it cannot be retrieved easily. The user would then have to re-enter the disk encryption key before resuming the system, or enter a password to decrypt that key. This is not trivial to implement because the system cannot read any information from the encrypted disk until the user has entered the right password, so all software needed for entering passwords and setting input and output devices to a known state must be available before the system is resumed.
It is not possible to implement the same protection when the screen is simply locked, because there will usually be some software that wants to access the hard disk while the screen is locked. The paper describes a way to make it slightly more difficult to retrieve the key from RAM: if the system does not need to access the disk for a while, it could scramble the key (in a reversible way) and spread it over a larger area in memory in such a way that a single bit error over the whole area would make the key unusable. As soon as the key is needed again, it is reassembled and used until it is not needed anymore. This can provide some limited protection because the cold boot attack does not always get a perfect copy of the RAM. But even with this additional level of protection, it looks like a locked screen is a very weak protection against data theft.
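The key-spreading trick described above can be illustrated with a simple XOR split (a minimal Python sketch of my own, not the actual algorithm from the paper): the key is replaced by several random-looking shares spread over a larger memory area, and since XOR-ing all the shares together must reproduce the key exactly, a single bit error anywhere in that area corrupts the recovered key.

```python
import os

def spread_key(key: bytes, n_shares: int = 64) -> list:
    """Split 'key' into n_shares XOR shares occupying a larger memory area.

    Reversible scrambling: XOR-ing all shares together yields the key,
    so flipping any single bit in any share flips a bit of the result.
    """
    shares = [os.urandom(len(key)) for _ in range(n_shares - 1)]
    last = bytes(key)
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    shares.append(last)
    return shares

def reassemble_key(shares: list) -> bytes:
    """Recombine the shares into the original key."""
    key = bytes(len(shares[0]))
    for s in shares:
        key = bytes(a ^ b for a, b in zip(key, s))
    return key
```

The shares would be reassembled only while the key is actually needed, then discarded again; the point is simply that an attacker now needs a perfect copy of the whole spread area instead of a perfect copy of 16 bytes.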
Following the link from Luis’ blog, I discovered the mini-site about Vernor Vinge’s novel “Rainbows End” (winner of the 2007 Hugo Award).
Besides a full text version of the book, that mini-site also contains some simple illustrations made by the author. Looking at the bottom of the “Outer” image, I saw the following copyright notice: “Vernor Vinge, 2006 (Using the GIMP)”
This is not entirely surprising, considering that he is a friend of Free Software and even a member of the award committee for the FSF Award for the Advancement of Free Software. But still, it is nice to see a well-known SF author who is also a GIMP user. As a GIMP developer and SF fan, this made me happy.
I doubt that Vernor Vinge will ever read this blog, but for the next illustrations I would recommend using Inkscape for the line art and text. This would lead to better results than using GIMP alone.
After reading Xan’s article The Cyclomatic Horror From Outer Space analyzing the complexity of some GTK functions, I was curious to run the same test on the GIMP source tree in order to see which parts of the code would be the hardest to test. The test is very simple and can be summarized as counting the number of decision points in every function of a program (so you get an idea of the number of possible code paths).
I did as suggested and started with “apt-get install pmccabe”, followed by “pmccabe app/*/*.c | sort -nr | head -10” to get the 10 functions with the highest (worst) results. This gave me the following table:
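For the curious, the core of such a metric can be approximated in a few lines: count the tokens that create a decision point in the code. This is only a naive sketch of my own, nowhere near as accurate as pmccabe itself, which parses the C code properly:

```python
import re

# Tokens that create a decision point (branch) in C code.
DECISION_TOKENS = ("if", "while", "for", "case", "&&", "||", "?")

def approx_cyclomatic(c_code: str) -> int:
    """Very naive approximation: 1 + number of decision points.

    This just counts tokens, so comments and string literals
    will skew the result; a real tool tokenizes the code first.
    """
    count = 1
    for tok in DECISION_TOKENS:
        if tok.isalpha():
            # word boundaries so "if" does not match inside "endif"
            count += len(re.findall(r"\b%s\b" % tok, c_code))
        else:
            count += c_code.count(tok)
    return count
```

For example, a function containing one `if` with a `&&` condition has three possible paths, so its complexity is 3.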
(Table: the 10 most complex GIMP functions, with their cyclomatic complexity and lines of code.)
According to the CMU page on cyclomatic complexity, numbers between 21 and 50 reveal a “complex, high risk program” and numbers above 50 only occur in an “untestable program (very high risk)”.
What does this mean for GIMP? Not much. But if you touch one of these functions, please be careful… you might break things and it will be very hard to find where the bugs are hiding in that code.
GIMP users who follow tutorials written for Adobe Photoshop are sometimes confused when they see statements like “Save your image using quality 8 or 9 in order to get good results”, because this obviously does not match the scale from 0 to 100 used by GIMP (and other software based on the IJG JPEG library).
While working on some improvements for GIMP’s JPEG plug-in, I investigated the compression levels used by various programs and cameras. The analysis of several sample images allowed me to build the following mapping table (slightly updated since I posted a similar message to the gimp-web mailing list in August):
- Photoshop quality 12 <= GIMP quality 98, subsampling 1×1,1×1,1×1
- Photoshop quality 11 <= GIMP quality 96, subsampling 1×1,1×1,1×1
- Photoshop quality 10 <= GIMP quality 93, subsampling 1×1,1×1,1×1
- Photoshop quality 9 <= GIMP quality 92, subsampling 1×1,1×1,1×1
- Photoshop quality 8 <= GIMP quality 91, subsampling 1×1,1×1,1×1
- Photoshop quality 7 <= GIMP quality 90, subsampling 1×1,1×1,1×1
- Photoshop quality 6 <= GIMP quality 91, subsampling 2×2,1×1,1×1
- Photoshop quality 5 <= GIMP quality 90, subsampling 2×2,1×1,1×1
- Photoshop quality 4 <= GIMP quality 89, subsampling 2×2,1×1,1×1
- Photoshop quality 3 <= GIMP quality 89, subsampling 2×2,1×1,1×1
- Photoshop quality 2 <= GIMP quality 87, subsampling 2×2,1×1,1×1
- Photoshop quality 1 <= GIMP quality 86, subsampling 2×2,1×1,1×1
- Photoshop quality 0 <= GIMP quality 85, subsampling 2×2,1×1,1×1
The quality settings in Adobe Photoshop include not only the compression factor that influences the quantization tables, but also the type of chroma subsampling performed on the image. The higher quality levels use no subsampling, while the lower ones use 2×2 subsampling. The strange transition between Photoshop quality 6 and 7 (quality 6 having a higher equivalent IJG quality than 7) can be explained by the difference in subsampling: since quality 6 has less color information to encode, the size of the file will be smaller anyway, even if more coefficients are preserved in the quantization step.
You may also be surprised by the fact that the default GIMP JPEG quality level (85) matches the lowest quality offered by Photoshop: quality 0. This makes sense if you consider that the default “Save” offered by Photoshop is designed for high-quality images, so the losses should be minimized. But if you want to save images for web publishing, then Photoshop has a separate “Save for Web” feature that can save images using lower quality levels:
- Photoshop save for web 100 <= GIMP quality 98, subsampling 1×1,1×1,1×1
- Photoshop save for web 90 <= GIMP quality 96, subsampling 1×1,1×1,1×1
- Photoshop save for web 80 <= GIMP quality 93, subsampling 1×1,1×1,1×1
- Photoshop save for web 70 <= GIMP quality 90, subsampling 1×1,1×1,1×1
- Photoshop save for web 60 <= GIMP quality 85, subsampling 1×1,1×1,1×1
- Photoshop save for web 50 <= GIMP quality 86, subsampling 2×2,1×1,1×1
- Photoshop save for web 40 <= GIMP quality 79, subsampling 2×2,1×1,1×1
- Photoshop save for web 30 <= GIMP quality 74, subsampling 2×2,1×1,1×1
- Photoshop save for web 20 <= GIMP quality 70, subsampling 2×2,1×1,1×1
- Photoshop save for web 10 <= GIMP quality 60, subsampling 2×2,1×1,1×1
This mapping between Photoshop and GIMP quality levels for JPEG is not exact and is intentionally pessimistic for GIMP. There is some safety margin, so it is possible to decrease the GIMP quality level a bit and still get a file that is as good as the one saved by Photoshop.
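For those who want to use this mapping in a script, the table above can be written as a small lookup (the values are the ones I measured; the helper itself is just my sketch, and the names are mine):

```python
# Equivalent IJG settings for each Photoshop "Save" quality level,
# as measured in this post.  Subsampling "1x1" keeps full chroma
# resolution (4:4:4); "2x2" halves it in both directions (4:2:0).
PHOTOSHOP_TO_IJG = {
    12: (98, "1x1"), 11: (96, "1x1"), 10: (93, "1x1"), 9: (92, "1x1"),
    8: (91, "1x1"), 7: (90, "1x1"), 6: (91, "2x2"), 5: (90, "2x2"),
    4: (89, "2x2"), 3: (89, "2x2"), 2: (87, "2x2"), 1: (86, "2x2"),
    0: (85, "2x2"),
}

def ijg_settings(ps_quality: int) -> tuple:
    """Return the (IJG quality, chroma subsampling) pair to use in GIMP
    to match a given Photoshop 'Save' quality level."""
    return PHOTOSHOP_TO_IJG[ps_quality]
```

Note how the lookup reproduces the “strange transition”: `ijg_settings(6)` returns a higher IJG quality than `ijg_settings(7)`, the difference being the subsampling mode.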
Reminder: if you think that you will need to re-edit an image later, then you should never save it only in JPEG format. Always keep a copy in XCF format (GIMP’s native file format) so that you can edit it without losing any additional information.
Another reminder: using a JPEG quality level below 50 or above 98 is not a good idea. Saving a JPEG image using quality 99 or 100 is just a waste of disk space. You should use a different file format instead (such as PNG or TIFF). And below quality 50, you lose so much that it would be better to rescale your image and use a lower resolution before trying to compress it further.
Last Wednesday, I went to the gas station because my car was a bit thirsty. When I wanted to insert my card and pay for the fuel, I was greeted with this ridiculous error message: “The exception unknown software exception (0x0eedfade) occurred in the application at location 0x77e73887.”
This is so wrong…
- The error message goes to the wrong target: the customer cannot do anything about it anyway, so why does it appear on the screen? The touch screen was frozen, so I could not even press the OK button. In cases like this, the software should just log the error and blank the screen or display some customer-oriented error message such as “Out of order”. There should be a way to trap these errors (any kind of software error) and redirect them to the company that maintains these terminals instead of throwing them at the customer.
- An “unknown software exception” shows that things are definitely not under control. How can one trust a system that displays such a stupid error message?
- Minor detail: the error message is in English only, while the user interface of this terminal defaults to French and supports multiple languages (Dutch and German, but not English). Trapping the error and displaying “Out of order” in multiple languages would have been more appropriate and more customer-friendly.
- If you ask Google about this error message by searching for the error code and address, you will find several matches revealing that various applications are affected by this random crash: Internet Explorer, Photoshop, some Delphi applications and other specialized software. This looks like a mysterious Windows crash that confuses everybody.
- Using Windows instead of a more robust embedded operating system is just asking for trouble. The main advantage may be that some customers are already familiar with the Windows error dialogs and can recognize them from a distance, so they know that they should go away and not even bother reading the error message.
Last month, I was driving a bit fast towards Brussels because I didn’t want to miss my plane. It was raining heavily, but fortunately there wasn’t too much traffic on the motorway.
Suddenly, the three cars in front of me slowed down very quickly and swerved from the first to the third lane. Oops! The road was covered with sugar beets and some of them were still rolling… Fortunately, I was paying attention, so I quickly stepped on the brakes and then moved to the third lane like the other cars. Once I had slowed down enough, I grabbed my mobile phone and took this photo through the windshield without aiming much, because I still had to be careful and avoid the other cars… not to mention slaloming around the rogue beets trying to attack my car.
That was an interesting driving experience… I eventually reached the airport safely and got my plane just in time.
The latest draft of the GPLv3 contains many improvements over the previous ones. It also still contains several minor issues, some of which date back to the first draft. Among these, there is a paragraph that remained unchanged since the first draft, although there were several comments saying that it could provide a loophole:
The Corresponding Source need not include anything that users can regenerate automatically from other parts of the Corresponding Source.
The problem is that “automatically” is not defined and it could lead to abuses, including preventing users from running modified versions of GPL software on some devices (the Tivoization problem that GPLv3 tries to prevent). “Automatically” can cover current practices such as generating a configure script using autoconf, generating a parser from parser.y using bison, etc.
But “automatically” could also include some operations that are impractical in terms of time or special equipment required. A file that can be regenerated automatically but requires several hundred years of computation on a supercomputer will effectively prevent most people from compiling the software and installing it on their device (if that file is required during installation or during run time). The canonical example would be if the tool that regenerates the missing source file requires the factorization of the product of two very large prime numbers.
As long as the company selling the device provides the complete Corresponding Source (including the tools necessary for regenerating the missing files) and the Installation Information, they would be compliant with the GPLv3. As long as the source code (with the missing file) is the “preferred form of the work for making modifications to it”, they have followed the GPL to the letter… while still preventing users from running modified code on their devices.
Of course I reported this problem and I included links to the previous comments on the same issue. But it looks like this issue has been ignored so far, despite the fact that the comments on the first draft are more than a year old.
I bought my current camera (Nikon D70) in May 2004, just a few days after that model started to appear in shops. I have taken several thousand photos with this camera and I have been happy with the results… until a few months ago, when it started behaving in a strange way: sometimes the camera would take unusually long to power up, sometimes it would appear to work correctly for a while but then suddenly start to display bogus exposure values. Without moving the camera or changing the lighting conditions, it would randomly switch between 1/500s and 2 seconds, or some other very different values. When the camera started to behave like that, it was almost impossible to shoot anything with the correct exposure.
Over time, this problem became more and more severe, so I eventually decided to bring the camera back to the shop for repair. My camera was 3 years old, so the warranty had already expired. The guy at the shop asked me to pay 50 EUR in advance and told me that he would send it to Nikon for repair, but warned me not to expect too much: regardless of the brand and model, many cameras are simply returned unrepaired because fixing them would cost more than a new model. This is a bit annoying when the camera costs around 1000 EUR.
Two weeks later, I got a letter telling me that my camera was back in the shop. The letter did not mention any repairs, so I was a bit worried. I went to the shop to pick it up and I had a pleasant surprise: Nikon had replaced the defective circuits at no cost. When I tested the camera, I found that they had also upgraded the firmware to a newer version, which includes new features and fixes some bugs that I had encountered before. And they also cleaned the camera (inside and outside). So this is a happy ending. Thanks to the Nikon customer service!
Sophie was born on the 6th of September, a bit before 16:00. She is not a small baby, weighing 4.390 kg and measuring 55 cm, but she did not take too long to come out and greet her happy parents Isabelle and Raphaël. A couple of hours later, she met her sister Catherine, who is now two years old. The baby and mother are doing fine. The main job of the father was to hold the mother’s hand. This very important job was accomplished successfully, according to the mother.
The picture below shows Sophie less than 4 hours after her birth. She was about to ask for more food…
At the end of last year, my wife Isabelle asked me if I could create some postcards for the association she works for. The idea was to have something that can serve both as a greeting card for the new year and an indication that they have recently moved into new offices. After a couple of iterations with POV-Ray and GIMP, I eventually managed to create something that she and her colleagues were happy with. I created a winter scene showing their front door (modelled from photos), some cardboard boxes in front of it and some trees in the background instead of the ugly wall that is there in reality.
Later, she asked me to create some posters showing different versions of this scene. I re-rendered the winter scene at a higher resolution (approx. 3000 * 2000) and I also created a summer scene in which the snow is gone, the grass is greener and there are leaves on the trees. Here are some smaller versions of these images:
I gave Isabelle a CD-ROM containing these images and she went to the cheapest print shop nearby to get them printed on posters (100 * 70cm). At that size, the resolution of the images is 75dpi. I thought that it would be good enough for a poster that is designed to be viewed from some distance.
Well, it turns out that the employee of that shop discouraged Isabelle from printing the posters because they were not at 300dpi. She claimed that the images would look bad because of the low resolution. On the other hand, she looked at the samples that I had given to my wife (printed on photo paper) and she said that she would not be able to get the same quality on posters. Go figure. Oh, and she also said that her computer could not read that strange PNG format, but fortunately I had also included a JPEG version.
Anyway, she asked for 300dpi so she will get them. For 100 * 70cm, 300dpi means about 12000 * 8000 pixels, for a total of 97 megapixels. I wish I had a camera that could take photos at that resolution. I estimated that re-rendering the image with POV-Ray would take a couple of weeks using my fastest computer (which just died anyway – see my previous entry), so I simply let GIMP scale up the image and I only recreated the parts in which the details could be interesting. In the process, GIMP asked me if I really wanted to create an image that took more than 1 GB. Well, sure, that’s what the shop asked for! I will burn these files on a CD now. I hope that they will not complain that the image is too large for their computer to handle…
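The arithmetic behind that request is easy to check: convert centimeters to inches, then multiply by the resolution. A quick Python sketch (the function name is mine):

```python
def pixels_for_print(width_cm: float, height_cm: float, dpi: int) -> tuple:
    """Pixel dimensions needed to print at the given size and resolution."""
    inch_cm = 2.54  # one inch is exactly 2.54 cm
    width_px = round(width_cm / inch_cm * dpi)
    height_px = round(height_cm / inch_cm * dpi)
    return width_px, height_px

w, h = pixels_for_print(100, 70, 300)
# -> (11811, 8268), i.e. about 97.7 megapixels for the 100 * 70cm poster
```

At 75dpi the same poster only needs about 2953 * 2067 pixels, which is why the original 3000 * 2000 renders were roughly the right size.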