Overcoming major bottlenecks in graphic editing

When going from draft to final you jump from editing small documents and image files to building up huge press-ready files. With most text documents this is not an issue, but images can quickly leap in size, and your old desktop or laptop, which seemed quite fast for regular web-based work, becomes unusable.

The hardest page to edit is the cover of a book. If you have commissioned an artwork and had it scanned at a print shop, then you’ll be the proud owner of a huge TIFF or PNG scanned at 600 or 1200 dpi – say 6000×4000 pixels or bigger. Equally, images from professional cameras will be a minimum of 12 Megapixels (4000×3000 pixels).

The edits you are doing are simple: cropping to the correct aspect ratio, removing rough edges from the scan, aligning it, removing artefacts, or layering in other images. But at these file sizes, forget trying such edits on your old laptop or desktop, and forget placing files this big in image frames in Scribus. Unless your machine is new, with at least a dual-core processor and 2 Gigabytes of memory, it will grind to a halt. We’re using GIMP 2.6, so other programs may be more memory-efficient, and your file may seem small, but memory use also grows with the undo history. If you are trying out ideas, being able to undo changes fast lets you work better, so all programs need to stay in real memory, not swapped out to disk.
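In GIMP 2.6 the relevant memory knobs live in Edit → Preferences → Environment, or equivalently in the gimprc file. A sketch of the settings involved – the option names below are from GIMP’s gimprc format, but the values are illustrative only, not recommendations:

```
# ~/.gimp-2.6/gimprc -- illustrative values only
(tile-cache-size 1024M)   # how much image data GIMP keeps in RAM
(undo-levels 16)          # minimum number of undo steps kept per image
(undo-size 256M)          # memory budget for the undo history
```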

So you can buy new laptops, or upgrade everyone’s laptop memory for the one short period they work on this book, or you can try something cheaper. This is what we did: we bought a single new desktop-scale PC, put a good amount of memory in it, and then anyone who wanted to edit large files used a remote desktop to log into that one machine.

Nothing fancy: we used a regular desktop-style CPU. You could buy an off-the-shelf PC (which would probably run Windows) or, even cheaper, build one yourself from retail parts and run a Linux distribution. Whatever parts you choose, you need a reasonable amount of memory, reasonable processor speed and lots of disk space.

You don’t need a fancy graphics card: fast graphics cards are only really useful for 3D games, and the onboard graphics from the major providers (ATI/Nvidia) on new motherboards are perfectly fine for this kind of “server” use.

How much is “reasonable”? Well, this is a desktop class of machine, so 4 Gigabytes works out as a good number for one or two editors. On either Windows 7 or Ubuntu (desktop 10.10) this leaves around 3 Gigabytes for applications once the onboard graphics takes a few hundred megabytes. If you want to go above 4 Gigabytes then you need to run either 64-bit Windows or a 64-bit GNU/Linux distribution.
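Once the machine is built, it is worth checking how much memory it actually reports after the onboard graphics has taken its share. A quick sketch, run on the Ubuntu server:

```shell
# Check how much memory the operating system can actually see
# (the onboard graphics carve-out has already been subtracted):
free -m                          # totals in megabytes
grep MemTotal /proc/meminfo      # the raw figure in kilobytes
```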

What is a reasonable processor? Retail desktop processors from both Intel and AMD are very good value for money, but at the moment AMD offers the best value if you want to build your own. As you intend this to be a multi-user machine, you ideally need a dual-core processor. That means an Athlon II X2 or better, though if you really want the lowest budget, the Sempron 140 processors can be unlocked on a suitable motherboard to give you two cores.
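A quick way to confirm that the built (or unlocked) processor really exposes two cores to the operating system:

```shell
# Count the cores the kernel can actually schedule on:
nproc                                  # number of usable cores
grep -c '^processor' /proc/cpuinfo     # the same count read from /proc
```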

The choice of processor and motherboard will drive your memory selection. You’ll find that old memory is usually much more expensive than new memory at the larger sizes. If you are looking at, say, a 4 Gigabyte memory kit (2×2 Gigabytes), then you should be looking at DDR3 memory on a matching motherboard.

It wasn’t that long ago that I loved to find old machines (“trashware”) and carefully rebuild them from parts. That meant working with 10–40 Gigabyte PATA hard drives and PC133 SDRAM on Pentium III and Duron era machines. For this job that approach is counter-productive: new parts have orders of magnitude more capacity than old equipment, so much so that it is far more cost-effective to buy new memory or disk drives than to work out how to fit a new operating system on a 10 Gigabyte partition. New DDR3 memory is a penny a megabyte at Gigabyte scales, and new hard disks are nickels per Gigabyte. Old DVD-ROM drives (even ones that cannot burn disks) are still useful, but CD-ROM/CD-RW drives are more or less junk now that USB sticks are everywhere. Old MicroATX cases are still usable if you don’t mind having no front-panel audio or USB ports and buying a new power supply unit. Overall, playing with trashware is fun, but it is not the right equipment for this kind of job – you need new parts, not old parts.

For the operating system, it depends whether the machine is pre-built from a major supplier or one you built yourself. If it is pre-built, you’ve probably been given some edition of Windows 7. If you built it, the best value for money is a mainstream GNU/Linux distribution such as Ubuntu or Fedora. The problem with staying on a retail, non-server edition of Windows is that it is not a multi-user operating system: it won’t allow two simultaneous users to run a desktop on the machine. You would have to buy a Windows server edition (this is what Windows Server 2008 Foundation edition is targeted at – competing with out-of-the-box GNU/Linux for small businesses).

For our server we used the Ubuntu 10.10 desktop 64-bit edition, downloading the ISO over BitTorrent. This is free to do and you are allowed to do it: you can install Ubuntu on as many PCs as you like for free.
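Ubuntu publishes checksum files (MD5SUMS) alongside each ISO, and verifying the download before installing is worth the minute it takes. A minimal sketch of the check, demonstrated here on a stand-in file rather than the real image:

```shell
# Demonstrate the checksum-verification step on a stand-in file; for
# the real image you would run "md5sum -c MD5SUMS" against the
# downloaded ISO and the published MD5SUMS file.
printf 'stand-in for the ISO contents' > demo.iso
md5sum demo.iso > MD5SUMS.demo      # stands in for the published file
md5sum -c MD5SUMS.demo              # prints "demo.iso: OK" on a match
rm demo.iso MD5SUMS.demo
```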

To allow multiple Windows desktops to use this server, the easiest route is to install the FreeNX software on the Ubuntu machine and then install the NoMachine NX Client on each Windows PC that will remotely access the new PC. We’re using NX Client for Windows 3.4.0-10.
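The server-side install can be sketched as below. The PPA and package names (ppa:freenx-team, package freenx) are as they were for Ubuntu in the 10.10 era – check the current FreeNX instructions before relying on them. The commands are wrapped in a function so the sketch installs nothing just by being read or sourced:

```shell
# Sketch of the server-side FreeNX install (PPA/package names assumed
# from the Ubuntu 10.10 era -- verify before use). Wrapped in a
# function so sourcing this file does not install anything by itself.
install_freenx() {
    sudo add-apt-repository ppa:freenx-team
    sudo apt-get update
    sudo apt-get install freenx
}
# On the server, run: install_freenx
# Then point the Windows NX clients at the server's SSH port.
```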

Assuming you’ve added accounts on the new Linux machine, run the NX Client on your Windows PC. The main problem you may hit is a clash over cygwin1.dll. If you have existing Cygwin-based applications running, the NX Client may fail with “Cannot initialize the display service.” If closing the obvious Cygwin-based applications doesn’t help, use the Sysinternals Process Explorer to search for which process has cygwin1.dll loaded – some applications (e.g. DeltaCopy’s rsync.exe) can leave Windows explorer.exe holding their copy of the DLL. Note that the cygwin1.dll the NX Client loads lives at a path such as C:\Documents and Settings\\.nx\plugin\Windows\bin rather than the installed location.

If you keep getting “Authentication failed” messages even though you are confident you are typing the correct password, remember that Unix is case sensitive: check that your logon name is exactly as it appears on the Unix side, which usually means lower case.
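You can take the guesswork out by reading the account entry exactly as the server stores it (“alice” below is a hypothetical user name):

```shell
# Show the account entry exactly as the server stores it; the NX logon
# name must match this spelling and case ("alice" is hypothetical):
grep '^alice:' /etc/passwd || echo 'no such user - check spelling and case'
```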

Other than those initial teething problems, you will now have a regular Linux desktop. From it you can install new applications, such as GIMP or Scribus-NG, and run them on the new machine.
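Installing the editing tools on the server can be sketched the same way as the FreeNX install – the package names (gimp, scribus-ng) are from the Ubuntu archive of the 10.10 era, with scribus-ng being the development-branch package at the time; verify them against the current archive:

```shell
# Sketch of installing the editing tools on the server (package names
# assumed from the Ubuntu 10.10 archive). Wrapped in a function so
# sourcing this file does not install anything by itself.
install_editing_tools() {
    sudo apt-get update
    sudo apt-get install gimp scribus-ng
}
# On the server, run: install_editing_tools
```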

Sharing data and text files between your Windows PC and the Unix machine is easy via the NX program: go into the NX client configuration and, under the Services tab, enable print and file sharing.
