A new study from Oxford University has suggested that post-traumatic stress disorder can be significantly reduced by playing the classic puzzle video game Tetris. The 1984 game's visually oriented design is believed to make it easier for sufferers to avoid flashbacks by keeping their brains occupied.
The Oxford researchers also found that verbally oriented games, such as Pub Quiz, actually made things worse.
AT Dec 2010
Supermarket E-books
The supermarket chain Sainsbury's has announced that its new website will stock a range of ebooks from early next year. This should provide some effective competition for Amazon, something that the established book retailers have failed to do.
On 22nd October, Microsoft stopped all software development activity on Windows XP.
Windows XP was rolled out in 2001 and is still the world's most popular operating system with over 60% of the market. However, Windows 7 is steadily eating into that share at the rate of 1% to 2% per month.
Microsoft will not issue any further Service Packs or feature enhancements for the operating system, but it will provide XP security updates until April 2014.
AT Nov 2010
The US organisation Forrester Research has published a report on the present state of the e-book market. It estimates that the 2010 sales total for e-books will be $966 million, rising to over $1 billion in 2011. These figures confirm the fast expansion of the e-book market which has been reported by several other sources.
The report points out that only a small fraction (7%) of on-line adult readers read e-books at present. Therefore the scope for further increases in sales is large: the report estimates that the total will reach $3 billion by 2015. Crucially, this estimate assumes that no further improvements to e-book readers occur, i.e. it is likely to be an underestimate.
There is now little doubt that e-books will form a major part of the book market in the medium term and may even form the dominant part in the long term. Perhaps the most important question is whether the new type of book will cause the overall book market to expand.
It is too early in the e-book development cycle for there to be any certainty about how the reading habits of the majority of people will be affected by the new option. However, figures from Amazon suggest that there may be some increase in the size of the total book market and this has been reinforced by the sales performance of John Grisham's new book, The Confession. Nielsen BookScan says that the sales of this book in its first week were 160,000 hardbacks and 70,000 e-books. This total of 230,000 first-week sales must be compared to the first-week sales of the last Grisham book, The Associate (223,000 in hardback). A modest overall increase has been generated. Considering the immaturity of the e-book market, this is encouraging.
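The arithmetic behind that comparison is simple enough to sketch, using the Nielsen BookScan figures quoted above:

```python
# First-week sales of Grisham's The Confession (hardback + e-book)
# versus The Associate (hardback only), per Nielsen BookScan.
confession_hardback = 160_000
confession_ebook = 70_000
associate_hardback = 223_000

confession_total = confession_hardback + confession_ebook
increase_pct = 100 * (confession_total - associate_hardback) / associate_hardback

print(confession_total)        # 230000
print(round(increase_pct, 1))  # 3.1
```

An uplift of roughly 3%: modest, as noted, but positive.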
AT Nov 2010
The potential of flexible displays to improve the portability and durability of laptops, netbooks and e-book readers is obvious. Periodically, a company announces that it has such a product almost ready for the market, but very little happens after that. The difficulties of taking a radically new product from the laboratory to the factory are large and, so far, no company has managed that for this type of product.
Existing computer/e-book displays have glass as their main structural component. Glass has many advantages, but it is a fragile material and, as a sheet, it is certainly not flexible. A Taiwanese government-funded research lab, ITRI, has developed a novel method of using the advantages of glass during manufacture, without suffering its disadvantages in the final display product.
ITRI has developed a polymer material that can be sprayed onto a glass backing and that maintains its properties while the rest of the display layers are being deposited on it.
When the displays have had all layers deposited, the polymer substrate is simply peeled off the glass to give a light-weight, flexible and durable product. In order to allow separation of the substrate from its glass backing, ITRI also developed a release material that is sprayed on the glass before the polymer.
The major manufacturing advantage of the new product is that it uses techniques which are already in use in display factories. So, the laboratory/factory transfer difficulties should no longer be a show stopper.
So far, one Taiwanese company has taken a licence for the product and other companies are showing interest. It is expected that the new displays will be in computers and e-book readers in 2011.
AT Nov 2010
A survey from Dimensional Research, acting for the software management company Dell Kace, has highlighted the caution that many organisations are exhibiting with respect to upgrading their Microsoft software.
The survey's findings on Windows operating system deployment were not surprising. Almost all respondents (92 per cent) said they planned to avoid the Vista edition and go directly to using Windows 7, the current version.
This caution also applied to the popular Microsoft Office suite. A major reason (cited by 45% of respondents) for reluctance to undertake an Office upgrade is the ribbon user interface. This was introduced with Office 2007 and enhanced with Office 2010. Any change which modifies the user interface results in a need to learn new methods of working with the software. For individuals and organisations, this means that productivity falls temporarily. A judgement has to be made on whether the long-term gains will outweigh the short-term loss. To confuse the issue, at the changeover point there is always the possibility of changing to a completely different suite, such as OpenOffice, and thus saving a large amount of money.
Another reason given for delaying migration to newer versions of a software package was a concern about compatibility with other software that was being used. Of course, this concern is most important for operating systems and this was one of the issues that were addressed by a new Intel report.
Intel had already decided that the preferred Windows migration path was Windows XP to Windows 7 and set about testing its desirability. It established an early-adopter group of 3,000 employees. This testing group was provided with a carefully selected workflow and the tools to accurately inventory, normalize, prioritize, and test applications, as well as to remediate any compatibility issues found during testing.
One major potential problem was found by Intel. This was the compatibility of early versions of Microsoft Internet Explorer with Microsoft Windows 7. Internet Explorer is the dominant web browser and is of great importance to anyone wishing to communicate over the internet.
The current version of Internet Explorer is IE8 and this is what Windows 7 is designed for. Earlier versions suffer from compatibility problems. Therefore, Intel's Web applications that are currently coded for IE6 may not execute properly on a Windows 7 operating system.
Intel is a high tech company and it is always focussed on the future. Its priority is to move into the future with the minimum effort. However, most of us do not have this at the top of our priority list and are quite happy to continue using what we are familiar with, even if it is old-fashioned.
This is the obverse side of the compatibility coin. As companies like Intel move on, they leave many of us having difficulties communicating with them via our legacy systems. In the past, this has been very much to the advantage of Microsoft. We were all forced to update our software eventually. Now there are real alternatives and Microsoft has to try harder to retain its lead position.
AT Oct 2010
In the USA, Starbucks has launched a new digital network. It offers customers free e-books, films and free access to some paid websites such as The Wall Street Journal. Obviously, the objective is to keep customers buying cups of coffee as they make use of the free content. However, Starbucks is also selling items through the site.
Free iTunes downloads will be accessible directly through the new Digital Network.
Starbucks, in collaboration with SnagFilms, plans to run monthly themed film festivals on the Network.
Publishers Hachette Book Group, HarperCollins, Penguin Group and Simon & Schuster are contributing reading material for the Network. A new HTML5 reader has been developed for this and is designed to operate on laptops, tablets and smartphones.
Doesn't this look very similar to one vision of the future of public libraries?
AT Oct 2010
Once upon a time, machine translation was a joke. However, the potential advantages of being able to almost instantly obtain a translation were so huge that companies and government agencies continued to put large efforts into improving the performance of the software. Whilst it is still possible to find hilarious examples of mistranslations by machines, the overall situation has improved markedly.
For obvious reasons, Google has been at the forefront of the development activities. It has now become confident enough in its translation capabilities to think about translating poetry. Even expert human translators find this task very difficult.
As an academic exercise, Google's software engineers considered how well its Statistical Machine Translation system (see Automatic Translation below) could perform in a poetry translation task, where meaning, meter and rhyming schemes were the criteria.
The engineers found that they had to sacrifice a little accuracy to obtain the correct poetic form, but the results were quite good. One of the side effects of the work was that the software became capable of translating anything into poetry.
Of course, after machine translation, the quality of the poetry will always be open to debate. The most expert humans can provide greater insight, but there are only a few of those.
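As a toy illustration of the rhyming-scheme criterion mentioned above, the sketch below checks whether a quatrain follows an AABB pattern by comparing crude word endings. This is purely illustrative and bears no relation to Google's actual system, which models rhyme and meter statistically; the quatrain is invented.

```python
def rhyme_key(line: str) -> str:
    # Crude heuristic: treat matching final three letters of the last
    # word as a rhyme (real systems use phonetic dictionaries).
    word = line.split()[-1].strip(".,;:!?").lower()
    return word[-3:]

def is_aabb(lines: list[str]) -> bool:
    a, b, c, d = (rhyme_key(l) for l in lines)
    return a == b and c == d

quatrain = [
    "The summer sky is clear and blue",
    "To find the way we need a clue",
    "The evening sun is sinking low",
    "The river's pace is soft and slow",
]
print(is_aabb(quatrain))  # True
```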
The item below points out the continued popularity of the Windows XP operating system, in spite of its having been superseded many years ago. The reason for that popularity is that XP was, and is, a system which does its job reliably. That job, however, was defined in an era when the threats to computer security were less sophisticated.
Older versions of Windows XP have an inbuilt flaw. When attempting to find an available WiFi network, these operating systems will initialize an ad hoc network with the same title as the last one to which they made a successful connection. If that network was called "Free Public WiFi", they will connect to any computer offering this connection. Once connected, they become very vulnerable as the new host will have good access to the computers to which it is connected.
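The viral character of this flaw is easy to see in a toy simulation. The sketch below (hypothetical numbers, and greatly simplified) models a population of unpatched laptops: any machine remembering the "Free Public WiFi" name advertises it, and any machine that joins such a network remembers the name and starts advertising it in turn.

```python
import random

random.seed(1)

N = 50                    # unpatched XP laptops in our toy population
carries = [False] * N     # does each laptop remember "Free Public WiFi"?
carries[0] = True         # one machine once joined the rogue network

def step(carries):
    # Each carrier advertises the ad hoc SSID; one random nearby laptop
    # sees the attractive name, joins, and in turn remembers it.
    new = carries[:]
    for infected in carries:
        if infected:
            new[random.randrange(N)] = True
    return new

history = []
for _ in range(8):
    carries = step(carries)
    history.append(sum(carries))

print(history)  # the carrier count only ever grows
```

Because the name is never forgotten, the number of carriers can only grow, which is why this SSID was reported in airports and cafés worldwide.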
Fortunately, there is an easy solution to this problem: install Windows XP Service Pack 3 or upgrade to Windows 7, if you feel rich enough.
AT Oct 2010
In spite of the great popularity of the newest version of the Windows operating system, Microsoft is having problems persuading some computer users to upgrade their systems from Windows XP to Windows 7. These users took note of the poor performance of the intermediate version, Vista, and refused to upgrade from XP. They found that their computers continued to work well and saw no reason to regret the decision. When Microsoft invited them to upgrade to Windows 7, they asked themselves why they should pay good money for what they had found to be a marginal benefit. Why not continue to work with their present system until the hardware requires updating?
To overcome this sales resistance, Microsoft has reinstituted the Windows 7 Family Pack for a limited (unspecified) period. The offer provides a copy of Windows 7 Home Premium with licenses for three PCs. US customers were able to order the pack from 4th October for $150. International buyers, including those in the UK, will need to wait until 22nd October.
If you are not persuaded by the Microsoft offer and intend to continue using Windows XP, it would be a good idea to check the version of Internet Explorer (IE) you are currently using (assuming that you are using this browser). Microsoft is, at present, beta testing IE9 and a full release of this version can be expected in a few months. IE9 and later versions will not run on Windows XP. Upgrading to IE8 would be sensible if your browser is an earlier version. This will give you the maximum support from Microsoft.
AT Oct 2010
E-book sales volumes have been increasing exponentially over the last few years. However, this trend was abruptly interrupted in the second quarter of 2010, when a slight decline was recorded. The first quarter's figure from the American Association of Publishers & the International Digital Publishing Forum was $91 million and the second quarter's total was $88.7 million. Comparing first-half-year sales for the last four years shows just how sharp the sales deceleration was.
US E-book Sales - $ million
Commentators suggest that the sudden change in the e-book marketplace was caused by the introduction of the agency pricing model (the publisher sets an e-book's price) espoused by Apple and several publishers. These companies assumed that e-book prices could be substantially increased without materially affecting the sales volume. The market's reaction indicates that they may have been over-optimistic. We need a company-by-company breakdown of the overall US figures to come to a firm conclusion. However, the UK can be expected to follow the same trend as the USA - it always does.
The provisional conclusion that consumers did not like paying $12 to $14+ instead of the previous $9.99 does seem plausible. The small decrease in one quarter's sales volumes is not very impressive in itself, and it is easy to underestimate the significance of the change in customer behaviour. Of course, not all companies support the agency model and $9.99 e-books are still fairly easily available. If price is the problem, sales at Amazon and other companies supporting the low-price model will have continued to increase (recent Amazon announcements have indicated that this is the case, but no figures have been produced to back up the claim). This would mean that the agency-model companies have experienced a major market rejection. No doubt they will deny that this is the case and will point out reasons for ignoring the sales hiccup - perhaps statistical errors. All statistics are inaccurate to some extent, but the degree of error needed to turn a strong, accelerating positive trend into a negative trend is very large.
The agency model companies have another worry. This is the legal one. Their methods have been described as "price fixing by the back door". Retail price maintenance was outlawed some years ago and the publishers' agency model appears to be an attempt to circumvent the legislation.
All will become clear, in time. In the meantime, shop around.
AT Aug 2010
The Taiwanese firm E Ink makes the displays for most e-readers, so its production figures provide a good picture of what is happening in the e-reader marketplace. The company has stated that the delivery rate of its 9.7 inch display for the Amazon DX reader tripled when the retailer reduced the price from $489 to $379. E Ink expects further price reductions from Amazon, Barnes & Noble and Sony and, as a result, believes that the number of e-readers sold in the second half of 2010 will be triple that of the first half.
The precipitate downward trend in e-reader prices does not necessarily mean that the market segment is unprofitable for the established manufacturers and retailers such as E Ink and Amazon. It is possible for them to compensate for a lower unit profit margin by increasing sales volumes, provided their marketing expertise is good enough. Clearly, Waterstones' expertise is not good enough, as its parent, HMV, has taken charge of this aspect of the book retailer's business.
In contrast to most of the sleepy book trade, Amazon is continually innovating. It has developed the ebook market at a faster rate than any other company and it is now rumoured that it is getting ready to move into other areas of electronic consumer goods. The major cause of the rumours is an intensive Amazon recruitment drive for a large number of hardware engineers, but it is also known that the company has been looking at buying small hardware firms, presumably to shorten the time to market of its prospective new products.
One company which failed to get its product to market on time is Plastic Logic. This company was one of the pioneers of e-readers and was developing a high-end product in Cambridge. This was the Que, which had a 10.7 inch display, i.e. slightly bigger than the Kindle DX. With production scheduled to occur shortly in Dresden, this was one of the few European e-reader contenders. The expected prices for the two versions of the device were $650 (WiFi only) and $800 (WiFi + 3G). The rapid decrease in e-reader prices has made Plastic Logic's pricing unrealistic and the company has scrapped its plans for a launch in the near future. Instead, it will concentrate on the next generation of e-readers and, presumably, try to remove the bottlenecks in its development system.
AT Aug 2010
John Wiley, the publisher of scientific, technical and medical journals, encyclopaedias and professional/trade books, has launched a new on-line venture. The Wiley Online Library is now offering access to over 4 million articles from 1,500 journals, 9,000 books, and hundreds of reference works and databases.
Whilst the internet has, for a long time, offered a far greater quantity of reference material than most public library reference sections, the quality of that material is sometimes lower than that available in the library. The Wiley venture will improve this significantly.
AT Aug. 2010
The Sony Reader Pocket Edition is now available for £99.99. This version of the Sony range has been designed for maximum portability. It thus has a small (5 inch) screen and weighs only 220g. The downside is that it has a relatively small memory, capable of holding just 350 ebooks - significantly fewer than other readers.
The rapid fall in ebook reader prices continues. The increasing competition in the UK is having the expected effect.
AT Aug. 2010
Both Apple and Amazon have opened major new stores in the UK, but have done so in completely different ways. Apple has gone for overkill on glitz, in order to secure its products' fashion-item image, whereas Amazon has simply slipped its Kindle Store into the marketplace with only small amounts of publicity.
It is quite obvious that the two companies are targeting different market segments. Apple has its eye firmly fixed on the young and affluent: those in their late teens and early twenties who have not yet acquired mortgages and children to support. On the other hand, Amazon is seeking the cost-conscious majority of the population.
There is no doubt that Apple's strategy is high risk, but it probably did not have any alternative, given its overall marketing policy. Short-term success is fairly easy to obtain on the back of massive amounts of publicity, but sustaining it requires a continuous stream of exciting new products. Apple must know that it is riding on the back of a tiger, and this suggests that it is confident that it can keep the new products flowing.
Thus, we can assume that the Apple product pipeline is full and the only outstanding question is: can the company put those products into the marketplace on time? There is some evidence that it is having difficulty doing this. The technical problems of the iPad are probably more to do with inexperience in a very narrow market segment than with a rushed development. However, that cannot be said for the problems of the latest version of the iPhone.
In contrast, Amazon has built itself a large, steady book and electronic goods retailing business in the UK and is in a good position to quietly grow that further. The problems it experienced in moving the Kindle to non-US markets did not jeopardise this solid base, although they did damage the Kindle's prospects to a certain extent. As the Kindle becomes a more important product for Amazon, it will find itself beset with the short life-cycle problems of the computer business, but not to the same extent as Apple.
AT August 2010
The Kindle ebook reader will finally go on sale in the UK on 27th August, according to Amazon. After a delay of approximately two years, Amazon has managed to get its UK ebook act together and is preparing to challenge the established suppliers, Sony and Apple, here.
Amazon has announced that the UK will get the new Kindle 3. This is a smaller, lighter, more power efficient version of the series. Amazon has managed to reduce the overall size by 21%, without reducing the size of the screen (6 inches). The weight of the new Kindle is 8.7 oz (247 g). It is thus very suitable for one-handed use by those strap-hanging on the London underground.
The battery life of the new model is one month with the wireless turned off, or 10 days with it turned on - a vast improvement over the iPad. The storage capacity has also been expanded to hold 3,500 books. This is very impressive, but I wonder who needs to carry around 3,500 books, especially when a book can be downloaded in one minute.
Amazon claims that the electronic-ink screen on the Kindle 3 has 50% better contrast than any other eReader and 20% faster page turns than its previous models. These improvements are just what ebook users have been looking for and will be very welcome, if they are actually delivered.
The retail cost in the UK will be £109 for the Wi-Fi only version and £149 for the 3G wireless version. There will be no monthly fees for UK owners of the 3G Kindle, which will use Vodafone's network.
Of course, there are still many in the book trade who believe that ebooks are just a passing fad and that the public will never forsake the paper-and-ink book. No doubt there were similar people within 16th century scriptoriums who believed that handwritten books were just as irreplaceable. The evidence is beginning to accumulate for a tipping point within the next two to three years. Amazon expects Kindle ebook sales to exceed its paperback sales by the end of 2011 and to eclipse combined hardback and paperback sales in the US shortly after that. Normally, the UK lags behind the US by approximately 18 months.
AT July 2010
Amazon has released some interesting new statistics about its ebooks and ebook readers showing the accelerating importance of this part of its product mix.
The company says that, over the past three months, for every 100 hardcover books it has sold, it sold 143 Kindle books. Over the past month, for every 100 hardcover books Amazon has sold, it has sold 180 Kindle books. The data covers Amazon's entire US book business and slightly understates the ebook/hardcover popularity ratio, as it includes sales of hardcover books where there is no Kindle edition. In addition, free Kindle books are excluded from the totals.
Of course, it is possible for the increasing ebook/hardcover ratio to have been caused by falling hardcover sales. To counter this suggestion, Amazon pointed out that its hardcover sales had also increased. It is a pity that the company did not state the actual number of sales in each category, so that everyone would be able to make their own judgement on the significance of the announcement - was the hardcover increase only marginal, as is suspected?
The rapidly increasing ebook sales appear destined to continue for some time, as Jeff Bezos, Amazon's CEO, has claimed that the sales growth rate of the Kindle ebook reader tripled after the company lowered the price from $259 to $189. This also indicates that the ebook reader market is quite price sensitive, and the normal falling price curve for electronic equipment should rapidly expand the overall ebook market.
The Kindle store now has more than 630,000 books and over 510,000 of these cost $9.99 or less.
AT July 2010
We have all seen and admired the virtual page turn effect on museum, British Library, Kindle, iPhone and iPad displays. A user sweeps his/her hand across the touch-screen to get to the next screen display. This gesture produces an animated image which progressively turns the existing page to reveal the following one. It is a neat idea and the software for it has been around for many years. Now Microsoft has made a patent application (20100175018) for the technique. This seems a little late.
AT July 2010
For a long time, automatic-translation software has been considered to be fairly useless, a bit of a joke. However, it is claimed that new techniques have now improved this performance. Instead of trying to teach a program the rules of language, computer scientists use large archives of online documents that have been previously translated by humans to train computers to recognise words and phrases that match across languages.
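A minimal sketch of the idea, using an invented four-sentence English-French "corpus" (real systems use millions of sentence pairs and proper alignment models, such as IBM Model 1, rather than raw co-occurrence counts):

```python
from collections import Counter, defaultdict

# Count, for each source word, the target words it co-occurs with
# across human-translated sentence pairs.
parallel = [
    ("the house", "la maison"),
    ("the blue house", "la maison bleue"),
    ("the blue car", "la voiture bleue"),
    ("the car", "la voiture"),
]

cooc = defaultdict(Counter)   # source word -> target word counts
tgt_count = Counter()         # overall target word frequency

for src, tgt in parallel:
    for t in tgt.split():
        tgt_count[t] += 1
    for s in src.split():
        for t in tgt.split():
            cooc[s][t] += 1

def translate_word(word: str) -> str:
    # Normalise by overall frequency so that ubiquitous words like
    # "la" do not win every pairing.
    return max(cooc[word], key=lambda t: cooc[word][t] / tgt_count[t])

print(translate_word("house"))  # maison
print(translate_word("blue"))   # bleue
```

Even this toy version never sees a grammar rule: the pairings emerge entirely from the human-translated examples.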
Unsurprisingly, Google is using this type of software as part of its search engine activities. There is also the possibility that London's public libraries could use this type of software to ease the task of integrating new immigrants into local communities.
AT July 2010
Amazon has unveiled a new version of its large screen ebook reader, the Kindle DX. This will have a lower price, web widgets and a sharper image. It is expected that deliveries of the new version will start later this month.
The price of the new Kindle will be $379, which is significantly lower than its predecessor ($489) and which also undercuts the Apple iPad.
Amazon claims that the new ebook reader will use improved display technology, to give a 50% sharper image. Of course, the iPad has a full colour display, so there is still some way to go before the Kindle can boast of an equivalent specification to the Apple device. However, if its main use is for reading text, the Kindle seems to be a better buy.
The ebook reader market appears to be developing as expected, with prices falling and technology improving at a fast rate. However, in spite of the enormous success of the iPad, there is still a long way to go before a definitive standard is reached.
AT July 2010
Toshiba has announced that it will begin delivering a dual touchscreen laptop computer this summer. The Libretto W100 will have two 7 inch screens which can be used in the same way as a printed book. Thus, there should be little difference between reading an e-book with the Libretto and reading a small paperback.
In addition to the e-book use, the computer can be used in the same way as a normal laptop. One of the touchscreens can become a keyboard for this type of use. Of course, there will not be any touch feedback, as with a normal keyboard.
AT June 2010
A new family history website is being set up which records historical views of the UK. The organisers of the website, historypin, rely on users uploading old photographs.
Use of the website is free, due to sponsorship from several companies. One of these sponsors is Google, which is providing the online storage space for the photographs.
The old photo can be placed onto a Google Street View image, if one exists, to allow a then-and-now comparison.
Anyone uploading photographs retains the copyright to them. This has enabled several national archives, such as the Mirror newspaper group's library and the London Transport Museum, to upload their collections.
AT June 2010
The Microsoft Office suite of programs is used everywhere and has been a fantastic money-spinner for the company. This dominant position in the marketplace is now under attack not only by the free OpenOffice suite (in the desktop environment), but also by Google Docs (on-line).
Microsoft has responded to the relentlessly increasing competition by making the most popular parts of the Office suite available on-line, without payment. The new Office Web Apps comprise Word, Excel, PowerPoint and OneNote.
The internet is an efficient method of communication, so it is weird that Microsoft has decided to restrict simultaneous editing to only Excel and OneNote. Google has found that the ability of team members to collaborate easily on Word-type documents is a major attraction.
There are other drawbacks to the new Microsoft offering. Word has only a rudimentary spellchecker. It is impossible to change a chart style in Excel. It is impossible to resize an embedded photograph in PowerPoint.
Microsoft is clearly very reluctant to wholeheartedly follow Google on-line and this leaves the field open to Google to develop as it wants.
AT June 2010
The previous government was committed to ensuring that everyone in the country eventually had access to broadband services, albeit not very fast ones (only 2 Mb/s). For the vast majority of the population, i.e. those that live in urban communities, this was a pledge with no obvious relevance, as broadband has been available there for a considerable time. There was some relevance, though, in the way that the government proposed to fund the provision. It intended to add £6 to each telephone bill in order to raise the necessary money. Of course, this effectively meant that the urban population heavily subsidised the rural one, but that is an inevitable consequence of the commitment. More worrying was the subsidy from the urban poor to the whole rural population, rich as well as poor. Those with a small income have to ration their use of the telephone carefully, and the addition of a flat-rate charge to each bill would have increased the telephone bills of the urban poor disproportionately.
There has, you may have noticed, been a change of government, and Jeremy Hunt, the new Secretary of State at the Department for Culture, Media & Sport (DCMS), has now decided that it would be better to take the required money from the BBC licence fee and to aim at a more universal improvement. The very slow broadband of the previous administration has now become "superfast" broadband. Of course, this new target is completely open to politicians' interpretation, but it would be completely ludicrous if it did not exceed 10 Mb/s.
The main beneficiaries are probably still the rural population. However, the inbuilt inequality of the previous proposals has been reduced.
The money for the program will be drawn from the digital switchover budget and is available because there has been an under-spend. It is possible that the new proposals are part of the war of attrition between national politicians and the BBC, but they are still an improvement.
Anyone buying a new PC or laptop from this summer onwards is likely to find that it arrives with Office Starter 2010 already installed. This starter pack comprises only Word Starter and Excel Starter. They are cut-down versions of the Office Word and Excel application programs. Although they do not have all the features of their big brothers, they are good enough for most general-purpose users.
The great advantage of the Starter suite is that it is free. The disadvantage is that it has adverts included with it. It is not clear why anyone would choose Office Starter instead of OpenOffice which is free, advert free and includes drawing and presentation applications in addition to word processing and spreadsheet programs.
AT June 2010
For some time there has been talk of e-book readers and notebooks with roll-up displays. The advantages of such a device for mobile applications are obvious and now Sony has released a video of the results of its work in this area.
The video shows a 4.1-inch, i.e. quite small, screen being continually rolled and unrolled while operating. The resolution of the display does not appear to be very high and the quoted 121 ppi confirms this impression. However, it is a colour display.
Sony definitely needs to do a lot more work on the technology before it can release it commercially, but it is clear that considerable progress has been made.
AT June 2010
It has been suggested that the reason Microsoft is about to issue the first service pack for its Windows 7 operating system, less than a year after it appeared on the market, is that it is trying to boost sales. Many businesses do not purchase an upgraded operating system until after the first service pack has been produced; in this way they avoid the teething troubles which frequently plague early adopters.
The end of official support for all but the final version of Windows XP occurs in July. Many businesses and libraries are still using this software, as they were unimpressed by the performance of its replacement, Vista. Thus, the early release of Windows 7 Service Pack 1 may well be intended to persuade XP users to upgrade sooner rather than later.
AT June 2010
For a very long time, Microsoft has had a virtual monopoly on computer operating systems. Its Windows software has over 90% of the market and it has been able to use this fact to acquire a similarly strong position in office and browser application software. Legal action by the US government and the EU has failed to change this situation significantly.
Google, as part of its endeavour to take over the world, is now preparing to challenge Windows' dominance. It has been developing its own operating system, Chrome, for some time and it is nearing the end of its gestation period. Google is now sufficiently confident of the product's usefulness to begin phasing out Windows from its own computers.
Clearly, no business would risk the integrity of its computer operations by using software containing bugs, and this must be doubly so for a business which makes its money from computer applications. Google's confidence must therefore be very high.
Of course, Google will roll-out Chrome internally in a minimum risk way, starting with the non-core operations and ending with the highly critical ones only when completely satisfied. However, it cannot afford to be too slow in this, otherwise the PR aspect of the exercise will be lost. The release date for Chrome OS is expected to be towards the end of this year, so its internal testing schedule looks very sensible.
Chrome is based on the open source platform Linux and should offer a cost-effective alternative operating system. Initially, it will be aimed at netbook, notebook and tablet computers, areas where Windows does not have a complete stranglehold. In due course, Google plans to extend its bridgehead into the more demanding desktop environment. Strong competition for Microsoft must be good news for computer users.
AT June 2010
Amazon has been keen to develop the new e-book market and has adopted a policy of selling these for $9.99. Many traditional publishers were unhappy with this approach, because it meant that Amazon demanded lower prices from them and caused their profit margins to be shaved. Therefore they welcomed the opportunity presented by the introduction of the Apple iPad to force a change on the e-book market.
The alternative model agreed by the publishing houses and Apple allows publishers to set their own retail price and to receive 70% of that price. This agency pricing model obviously stops retailers competing on price and, thus, appears to be anti-competitive (a return to the outlawed "retail price maintenance" policy).
As is to be expected, many popular e-books are now priced at $12.99 or $14.99, rather than the Amazon norm of $9.99.
This has not escaped the notice of the US legal authorities and the Texas attorney general has started making inquiries about the situation. Large publishers, such as HarperCollins and Hachette, have been approached for information.
Individual state antitrust authorities in the US are able to take action within their own jurisdictions. However, the US Justice Department is also making preliminary inquiries about Apple's practices in the music business, so it is not unreasonable to expect this to lead to a consideration of its impact on the e-book market.
AT June 2010
At the Computex trade show in Taipei recently, several new tablet computers designed around the Windows operating system were launched. This compensated to some extent for HP's decision to substitute the Palm system for Windows in its Slate tablet, first shipments of which are expected in October.
Asus has launched two tablets. One (EP121) has a 12-inch screen, bigger than Apple's iPad; the second (EP101TC) has a 10-inch screen, a little smaller than the iPad. Asus believes that using Windows 7 will give its new computers multitasking capabilities, video conferencing support and simultaneous running of the Word and Excel programs. HP has suggested that it will also give a less than optimum touch screen capability.
MSI has also launched a tablet with a 10-inch screen (WindPad 100) which uses the Windows operating system. However, it has also launched another version of the computer (WindPad 110) running on Google's Android open source system.
LG has launched a 10-inch screen tablet which is nearly identical to the MSI WindPad 100. This computer has less RAM than the MSI device, but runs a more comprehensive version of Windows 7.
The only price information available for the new computers is for the Asus EP101TC. This is reported to be priced at $399 to $449, i.e. at least $100 cheaper than the iPad.
AT June 2010
Barnes & Noble, America's largest bricks-and-mortar book retailer, is changing its business model. It has decided that it can no longer rely on its traditional model of selling heavily discounted ink-on-paper books. After 17 years of continual growth, the company is in decline.
The chosen remedy for the B & N woes is diversity. No longer will it be a one-product store. Whilst it does not close off other types of product, it has decided to try consumer electronics as a second string to its bow. To aid this change in direction, it has appointed a Silicon Valley veteran as chief executive. The new CEO has described the new business model as half technology and half retailing.
Barnes & Noble now has 1.2 million titles in its e-bookstore, so it has already made a good start in its new business area. It is expected to sell about 1 million of its Nook e-readers in 2010. This will probably put it in third place in the e-reader market, behind Apple (5.5 million iPad sales expected in 2010) and Amazon (3 million Kindle sales expected in 2010).
Where the US leads, the UK follows 18 months later and continental Europe a few months after that. This does not bode well for the bricks-and-mortar booksellers on this side of the Atlantic. Their business model has been torpedoed by new technology and there is a need to plug the hole created. Barnes & Noble has recognised the problem and shifted its position in the retail market. This is not an easy thing to do, because it involves a change in corporate culture, hence the change in CEO.
The implications for public libraries in the UK are obvious. Radical change is about to occur. Even in the US, the e-book market is in its infancy. Yet, in 2010, over 10 million e-readers will be sold there, from all sources. If these have an average of 200 e-books installed, there are 2 billion lost opportunities to sell a traditional book or to lend it from a library.
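The scale of that "lost opportunities" figure follows directly from the article's own estimates. A back-of-envelope sketch (the 200 e-books per device is the assumption quoted above, not measured data):

```python
# Rough sketch of the lost-opportunity arithmetic quoted above.
# Both inputs are the article's own estimates for the US in 2010.
readers_sold = 10_000_000      # e-readers expected to be sold in the US in 2010
ebooks_per_reader = 200        # assumed average number of e-books per device

lost_opportunities = readers_sold * ebooks_per_reader
print(f"{lost_opportunities:,}")  # → 2,000,000,000
```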
AT May 2010
The iPad will not become available in the UK on 28th May, as planned. The new date for the tablet to appear here is 7th June. In spite of the iPad's immense popularity (1 million sold), there have been some criticisms of its performance and price in the USA. A comparison of the US and UK prices shows:
[Table omitted in this copy: US versus UK iPad prices by flash memory size, for the WiFi and WiFi + 3G models.]
The effective exchange rate that Apple has used to translate the prices into sterling varies from 1.16 to 1.22 dollars to the pound. At the time of writing, the actual exchange rate is 1.468 dollars to the pound, i.e. the UK premium is 17-21 percent.
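The 17-21 percent figure can be reproduced from the two exchange rates, taking the premium as the fraction of the UK price that exceeds the fair-exchange-rate price. A minimal sketch:

```python
# Sketch of the UK-premium calculation implied in the text.
# The implied rates (1.16 and 1.22 $/£) are Apple's effective rates;
# 1.468 $/£ was the market rate at the time of writing.
market_rate = 1.468

for implied_rate in (1.16, 1.22):
    # Premium as a fraction of the UK price: the share of what a UK
    # buyer pays that is over and above the market-rate conversion.
    premium = (market_rate - implied_rate) / market_rate
    print(f"implied {implied_rate}: premium {premium:.0%}")
# → roughly 21% and 17%, matching the quoted 17-21 percent range
```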
The UK's operators' tariffs for connecting up the 3G iPad are approximately the same as existing broadband offerings (about £15 per month). This is believed to be because Apple has refused subsidies from the operators, in order to remain in total control of its product. This was its initial stance for the iPhone, but it subsequently changed its mind.
AT May 2010
There was a time when Microsoft allowed its browser, Internet Explorer, to gently fall into disrepair. It did react to security problems, but did little to accommodate changing website design requirements. Anti-competitive activities had given the company an effective monopoly in the field and it did not see any advantage in trying to keep its browser up to date.
Action by the EU and strong new competition from Mozilla and Google has driven the market share of Internet Explorer down from the high nineties to about sixty percent. This has changed Microsoft's attitude considerably. The competition is getting too close for comfort. Lethargy is no longer a viable option.
Lately, Microsoft has exhibited a more active policy of updating Internet Explorer, but the current version (IE8) has not been able to reverse the downward trend in market share. It has, however, managed to reduce the rate of decline a little (9.2% per year before its introduction versus 7.8% after).
In March, Microsoft committed itself to updating a preview of the next version of Internet Explorer (IE9, of course) every eight weeks until it issues a public beta. This "jam tomorrow" ploy is aimed at professional developers, rather than the general public. By revealing new tweaks in a continuous stream, the company keeps attention focused on Internet Explorer and ensures that developers move in step with it.
One of IE9's innovations is its support for HTML5. The new version of the website language allows videos to be run without relying on Adobe's Flash program.
The new attitude of MicroSoft can also be discerned in its treatment of its Windows operating system. The current version, Windows 7, is a genuine improvement over its predecessor and, in the seven months since its release onto the market, it has proved extremely popular.
This is not all good news. The more frequent upgrading of basic software means that users are forced to upgrade as well, in order to keep their systems running properly. Microsoft has confirmed that IE9 will not work with Windows XP. Most users do not use the full capabilities of their computers and their need for new bells and whistles is close to zero, so upgrading a computer system often does not give a user any more real functionality. Firefox and Chrome will still work on XP, of course, so Mozilla and Google may be beneficiaries of this lack of compatibility.
AT May 2010
Windows 7 and E-book Readers
Microsoft has announced that it will not continue the development of its Courier twin-screen e-book reader. Also, HP is believed to have stopped the Windows version of its Slate tablet computer.
One reason for these changes of direction is the difficulty of using Windows 7 with touch screens. The operating system has been optimised for use with a mouse, and modifications for touch screens have not been very successful.
AT May 2010
DIY Book Scanning
Recently, Google has monopolised the attention of those who take an interest in the digitisation of books. It has enormous ambitions and almost bottomless pockets. So, there is much for the bystander to wonder at. However, Google has got itself tied up in the US legal system over its digitisation plans and has been effectively stalled for the last couple of years. Now, a resolution to the Google Book Settlement legal tussle seems likely to drift even further into the distant future, as the judge who is conducting the Federal Court hearings is due to move on to another post.
With Google otherwise engaged, perhaps it is time for attention to turn to less high-profile digitisation projects. The most venerable of these is the Gutenberg Project (started in 1971) and this has quietly built up a very large collection. It has a European arm (Project Gutenberg Europe), so its catalogue is not purely a reflection of US tastes. This successful project uses volunteers to keep costs low. They proofread the e-books, procure eligible paper books, burn CDs for people without internet access and promote the project on their websites. The thing that the volunteers do not do is actually scan books. For this to be done efficiently and well, it has been necessary to employ automatic, commercial book scanners. These cost up to $50,000 and are thus well beyond the means of most individuals and many voluntary groups.
A $50,000 book scanner is able to capture 3,000 pages an hour. In contrast, a flatbed scanner can take between 15 and 30 seconds to capture a single page. Thus, most people would need up to three hours work to simply capture an image of a 400-page book and then additional post capture processing time would be needed. Further, the design of flatbed scanners means that a book has to be opened wide and pressed down flat onto the platen. Obviously, this can damage the book.
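The gap between the two approaches is easy to quantify from the figures above. A minimal sketch of the arithmetic for a 400-page book:

```python
# Back-of-envelope scan-time comparison using the figures in the text.
pages = 400

# Commercial scanner: 3,000 pages per hour
commercial_hours = pages / 3000          # ≈ 0.13 hours, i.e. about 8 minutes

# Flatbed scanner: 15-30 seconds per page
for secs_per_page in (15, 30):
    hours = pages * secs_per_page / 3600
    print(f"{secs_per_page}s/page: {hours:.1f} hours")
# → 1.7 and 3.3 hours of capture time alone, before any
#   post-capture processing
```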
The difficulties of DIY book scanning have now been tackled by several people and designs for suitable, very cheap scanners published on the internet. Additionally, software for post capture processing has also been developed/adapted and made available.
The only major obstacle which remains to more frequent (perhaps even widespread) DIY book scanning is the legal one. Legislators have produced a copyright nightmare situation. Their reaction to pressure from media companies has heavily weighted the balance against the public interest and generated much confusion. Even the mighty Google cannot escape the clutches of the lawyers. It is not surprising that the Gutenberg Project carefully talks about eligible paper books.
It has been left to the little known Tulane University to attempt to provide a solution to the copyright information problem. It is developing a web based service (Durationator) which will enable anyone to establish the copyright status of a book.
Project Gutenberg Europe: https://pge.rastko.net/info/volunteer
AT May 2010
Close of a Chapter in Book Publishing
"The situation is fraught with complications. It's a mess. It's wonderful." So says Peter Olson, former CEO of Random House and now Senior Lecturer of Business Administration in the Strategy Unit, Harvard Business School.
He was describing the publishing business at present and its search for a valid business model for the future, during a Q & A session at the HBS.
In a recent unpublished case study, "The Random House Response to the Kindle," Olson and professor Bharat Anand have given students an introduction to the book publishing business and the emerging landscape of e-readers. Whilst the description of the publishing scene is fairly familiar to many people, the conclusions that have been drawn by HBS students are less so.
One idea was to bundle the e-book and print book together with, possibly, a video interview with the author. By utilising the potential of the electronics it may be possible to actually grow the total market.
The students also raised the concept of dynamic pricing, or charging varying amounts based on how many "extras" are packaged with an e-book.
The students do not appear to have provided any positive answer as to what to do about the projected reduced demand for printing.
AT April 2010
Microsoft's Open-Door Policy for Public-Access Malware
Public computers, such as those in libraries, face daunting security and malware threats, and the Windows XP and Vista add-on SteadyState is often, if not usually, one of their key defence tools.
In the early 2000s, the Bill and Melinda Gates Foundation funded a drive to put computers into schools and libraries, including those in the UK. Whether this was enlightened self-interest or purely altruistic is unimportant; it was obviously in the public interest. The public-access computer security software used for this scheme was subsequently developed by Microsoft and eventually became the SteadyState program in 2007.
SteadyState functions by resetting the computer when a user logs off, thus protecting the user's identity and data. It also allows administrators to restrict how users can interact with the computer; for example, access to programs, websites, the Control Panel and disk drives can be blocked.
As the program was developed by Microsoft, it was aimed at Windows operating systems and has been an important part of public-access computers running Windows XP and Vista. Now, Microsoft has announced that there is no plan to develop a compatible version of Windows SteadyState for Windows 7 (the current version).
Most public libraries are still using Windows XP or Vista at present and are busily putting off the evil day when they have to upgrade. Upgrading would mean spending large sums of money, and there are currently more pressing needs. It is now apparent that the cost of upgrading will be significantly greater than expected, due to the need to buy third-party software to replace SteadyState.
The immense success in the UK of the public library People's Network and of coffee-house internet access is not unique. A Gates Foundation-funded University of Washington study has reported that some 77 million Americans (about 30% of the population) used a library computer or public Wi-Fi network to access the internet last year. This widespread public-access use is repeated around the world, in developing as well as advanced countries. The Microsoft decision will thus eventually have serious implications for billions of people.
It is quite possible that it will also have serious consequences for Microsoft. The use of the Windows operating system in schools and libraries is an enormously powerful marketing tool for Microsoft, and the company is now putting that in jeopardy.
University of Washington
AT April 2010
The Counter Attack
Apple's competitors do not intend to allow its iPad tablet computer an undisputed position in the marketplace. Following Microsoft and Asus (see below), Toshiba appears to be working on at least two similar devices. One of the Toshiba tablets will use the Windows 7 operating system and the second Google's Android system.
A 10-inch screen is the most likely configuration for one Toshiba device and a dual screen for the second. The company is thus backing both strands of e-book reader development.
Toshiba has stated that machines using the Windows 7 operating system will initially have a price premium. As Windows is a Microsoft proprietary product and Android is an open source one, this is to be expected.
Google and mobile phone giant Nokia are also believed to be developing tablet computers to take on the iPad. The Google device will use the Android operating system and the Nokia device will use Windows 7.
The open source Android and the widely used Windows system will give Apple's competitors a significant advantage. Additionally, all the competitors are expected to undercut the iPad's selling prices.
There is every likelihood that the tablet computer market will very quickly become a commodity one in the same way that the PC market eventually developed. As its main strength is its innovative technology, Apple does not usually try to dominate this type of market and it is normally content to become a minor, but profitable, player. The question is: has the iPad a sufficient technical advantage over its rivals to survive the coming battle?
E-reader devices, such as the Kindle, generally have limited processing power which restricts their overall performance. This results in their cost relative to function being high when compared to most multifunction devices. The iPad does offer some improvement in processing power, but it also costs more. The question is: does the iPad offer sufficient improvement to justify the premium?
The iPad's LED-backlit LCD screen has the advantage over the e-Ink displays used by its competitors of a shorter refresh time. It is also a colour screen, whereas its competitors only have monochrome ones at present. These are major advantages, and a comparison of the iPad screen with an e-Ink one has now been made by zdnet.com to quantify them.
The zdnet conclusion is: "The iPad appears to be adequate for light daytime indoor reading, but fails miserably as an outdoor reading device. Vizplex e-Ink readers such as the Kindle, the SONY line and the Barnes & Noble Nook still appear to be much more optimal for long duration reading indoors and outdoors during daytime hours." As many people expect to be able to read e-books on the beach or in the garden as well as in the sitting room, this is a damning verdict.
AT April 2010
The iPad Backlash
The greater the hype, the greater the disappointment. This seems to be the case for the iPad. There is no doubt that the new device was subjected to considerable hype and, now that the tablet computer has reached real people, its limitations are beginning to show. Chief among the criticisms are:
Problem 1: The lack of Flash support means that internet sites using animation are effectively no-go areas. Many of the world's most popular sites are in this category.
Problem 2: Some iPads will not stay on-line for more than 10-15 minutes. Some of these can be persuaded to operate properly by rebooting, but not all.
Apple has described the iPad as a "magical and revolutionary product". The magic appears to be in the marketing and the revolution appears to be in the uncritical acceptance of that by a very large number of people.
No doubt Apple will produce a mark 2 device capable of working as people expect and may even provide a fix for existing units. The question is: when? The competition is not idle. Microsoft has a twin-screen e-reader almost ready for launch (see below) and the Taiwanese manufacturer Asus is expected to release at least two tablet PCs in the next few months.
It is probable that the new iPad competitors will use the highly popular Windows 7 operating system. This will give them a considerable advantage over the iPad, as there are a huge number of proven application programs already on the market for this system.
By accident (possibly), Microsoft seems to have confirmed that it will soon release a tablet computer called 'Courier'. The device would be marketed as a rival to Apple's iPad.
Rumours of the new Microsoft computer have been circulating since September, when leaked documents showed a design for a twin-screen device without a physical keyboard.
It is believed that the device would be capable of:
· operating like a printed book (with facing pages) when held in portrait mode;
· operating in landscape mode with the top screen acting as the display and the bottom screen as a touch keyboard;
· operating as a tablet computer when hinged back on itself.
AT Mar 2010
The e-book market upset caused by the launch of the iPad has prompted much debate about what the right price level for e-books should be (see below), and The New York Times has now added to that with an interesting breakdown of the competing price models in comparison with hardcover books. Its analysis is:
[Table omitted in this copy: per-copy cost breakdown under the Apple and Amazon price models, covering printing, storage and shipping; design/digitising, typesetting and editing; and publisher's profit before overhead.]
The New York Times concluded, from its analysis, that customers have exaggerated the savings and have developed unrealistic expectations of how low the prices of e-books can go.
The figures produced in the article are probably close to the truth and the e-book/hardcover comparison is almost certainly justified, considering the relatively small proportion of the overall market occupied by e-books. However, today's market is not representative of the future one. We have arrived at a Gutenberg-type (pun not intended) discontinuity in publishing and no real attempt has been made to predict how this will change the industry. Indeed, the article could be misleading because of this.
As the e-book market develops, sales volumes will increase considerably. All commentators agree that e-book sales will become a significant proportion of the total; most suggest that this proportion will reach 20-30% very soon. Once this level is reached, the cost comparison with hardcover books begins to appear unjustified and the future would need to be predicted by comparison with paperbacks. The New York Times' definition of the future appears to be restricted to the next 2-3 years, and its claim that customers have exaggerated the potential e-book cost savings is going to be simply wrong in the long term.
Give in Order to Receive
For some time, there has been a persistent, low-level discussion going on about how the availability of free ebooks affects the sales of the print versions of these books. Some publishers and authors believe that making works freely available on the internet increases their visibility to a wide audience and, as a result, a noticeable increase in sales of the traditional version is obtained. Two researchers at the University of Michigan have now investigated what happened to the sales of 41 books after they became freely available over the internet.
The researchers, John Hilton III and David Wiley, have published a report on their investigation in the Journal of Electronic Publishing. Using Nielsen BookScan data, they looked at the sales of the books for the eight weeks before they became freely available and for eight weeks after. This data tracks approximately 70% of all book sales in the USA. Very importantly, the recorded sales are sales to people rather than sales to shops.
The research found that there is some truth in the belief that printed book sales increase when the ebook is given away, though the correlation is not a straightforward one. The model used by science fiction publisher Tor to provide the free versions was found not to produce extra print sales and may actually have the reverse effect. Unlike other publishers, Tor requires registration and only gives free availability for one week.
The researchers pointed out that there may be other benefits of providing free ebooks, such as increasing the sales of other books by the same author and generally improved market awareness of the publisher.
The gradually increasing number of in-print ebooks becoming freely available, in addition to the enormous number of out-of-copyright ebooks already in this category, tends to undermine the position of public libraries as the prime supplier of literature to a large proportion of the population. Whilst professional librarians are becoming aware of the potential problem, they have not yet come up with an answer.
AT March 2010
The Internet Good for Literacy
In the Huffington Post of 19th February there was a very perceptive blog post. Its basic premise was that the internet has increased literacy rather than decreased it, as some education pundits allege.
Until very recently, watching television was the preferred occupation of teenagers during their leisure time. This required little intellectual effort and near zero literacy. Literacy is an acquired skill and needs continual reinforcement. In the past, this reinforcement was provided by books and newspapers. However, the dominance of television has removed these influences and young adults have been exhibiting progressively lower levels of literacy.
In the face of the falling levels of literacy in the country, educationists had no solution and simply blamed others. The public libraries, the traditional education safety net, had given up any serious attempt at tackling illiteracy and were busily running down their book stocks. The young were, in fact, largely left to their own devices.
Miraculously, it seems that the young may have found their own salvation. Unlike television, the internet does require basic literacy even for social networking and some areas of it demand much more.
If the hypothesis is correct, we can expect the unedifying spectacle of politicians, educators and librarians each claiming credit for something they had no part in.
AT March 2010
Ebook Pricing and the iPad
The launch of the Apple iPad has put in jeopardy Amazon's plans to become the permanent dominant retail outlet for ebooks. The new Apple product has been delivered to almost nobody and it is already changing the scene.
It is not that the iPad is obviously a blockbuster product in the same league as the iPhone. In fact it has been described as simply a reworked tablet computer. The marketing strength of the iPad is the Apple name and the App Store.
People remember the groundbreaking products from Apple and forget the dismal failures. Because Apple is willing to bet on technical innovation, it is inevitable that it sometimes comes unstuck. So, perhaps it deserves its near stratospheric reputation to some extent. However, that should not cloud the judgement of potential purchasers of its products. Where there is a clear technical superiority, the usual very high cost of Apple products can be justified, for those with deep pockets. For the products, such as the iPad, where there has been a lower level of innovation this is much more difficult.
Whether to buy the iPad should rest on value for money. The extremely large premium which Apple is accustomed to demand for its products is not so marked for the iPad. These Apple price levels suggest that it is not confident that it has another of those blockbuster products. There is a premium, as always, but there is also a somewhat better product. The US prices are:
[Table omitted in this copy: US iPad prices by flash memory size, for the WiFi and WiFi + 3G models.]
The UK prices have not been announced, but the probability is that there will be a straight 1:1, $:£ conversion. This will make it fairly expensive but, for many people, affordable. It will also allow the competition plenty of room for manoeuvre. It is how the competition reacts that will determine the success of the iPad. If the competitors react at the same glacial pace as Amazon has with its Kindle, Apple will be able to establish a solid customer base and dictate the way the market develops. The competitors will do their best to frustrate this. As nVidia, the graphics card manufacturer, exhibited a tablet pc reference design very similar to the iPad last year, similar products can be expected in the very near future. The game has only just started.
The most obvious superiority of the iPad is in the display. The size of the display (9.7 inch diagonal) is no bigger than several competitors. However, its resolution (1024 x 768 pixel) is similar to that usually used for much larger monitors (17 inch & 19 inch) and is a little better than that of the competitors.
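The pixel density implied by those figures is easy to derive: the diagonal in pixels divided by the diagonal in inches. A quick check using the numbers quoted above:

```python
import math

# Pixel density implied by the iPad figures quoted above
# (9.7-inch diagonal, 1024 x 768 pixels).
diagonal_inches = 9.7
width_px, height_px = 1024, 768

diagonal_px = math.hypot(width_px, height_px)   # exactly 1280 for a 4:3 panel
ppi = diagonal_px / diagonal_inches
print(f"{ppi:.0f} ppi")  # → 132 ppi
```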
The iPad operating time on its battery is also good at 10 hours and it is considerably more versatile than most of them. This versatility lends credence to the theory that the iPad is derived from a straightforward computer.
With the arrival of the iPad, the Amazon eBook pricing structure has come under threat. Amazon has been trying to force publishing houses to accept a universal retail price of $9.99 for eBooks. The publishing houses did not like this and several resisted. The publishers said that they could not make money at this level and really wanted a price close to that of a hardback book. As the cost of producing an eBook should be extremely small, much less than that of a paperback (some experts suggest that a $5 price would be appropriate), this seems to be an optimistic expectation.
At present, there is no doubt that the Amazon Kindle has the major share of the US market and, because of the size of this market, Amazon has been in a position to dictate to English language publishers how ebooks should be sold. To ensure that it had this power, Amazon designed the Kindle to operate only with its proprietary ebook format. For instance, there is no ePub support (ePub is the industry standard for ebooks). So, if you buy a Kindle, you have to buy ebooks from Amazon. This is not a great problem as long as Amazon does not exploit its monopoly position. However, large US corporations have a very bad record in this respect and some publishers have been suggesting that exploitation is already happening.
It is certain that Amazon is using its muscle to keep the price of ebooks down. It wishes the ebook market to develop quickly, so that its long-run profits are maximised. It has always treated books as a commodity and has been uninterested in literary mystique. It is in business to make money, not to polish egos. Although publishers would claim the same outlook, many of them prefer to put hard commercial considerations some way down the priority list.
Those in local government who wish to reduce the number of public libraries in a borough often allege that the cost of books is very low and that there is not any need to have a free supply of them available. Inevitably, these individuals are earning a good salary and the cost of books does indeed appear low to them. Their attitude would be completely different, if they had to survive on a state pension or the minimum wage. For the people at the bottom of the heap, books are extremely expensive. There are many more people at the bottom of the heap than at the top.
Thus, whether books are considered expensive or cheap is really a matter of point of view, and this is exactly the situation with respect to ebook pricing. Amazon believes one thing and the publishers believe the opposite. Until the launch of the iPad, Amazon had the whip hand and relatively low prices were the order of the day. Apple has a similar marketing philosophy to the more conservative publishing houses: low volume, high price. It is not, therefore, surprising that they should quickly come to an arrangement about ebook pricing. Instead of the universal $9.99 ebook, we can now expect to see $14 - $15 ebooks.
It looks as though Amazon has shot itself in the foot. It set out to obtain a monopoly position by limiting the file formats that would work with its pioneering Kindle. It did obtain that position but failed to consolidate it by not following through with blanket market coverage. Many small, innovative manufacturers plus Sony were able to open up the market and prepare the ground for the iPad.
Amazon is not a natural manufacturer. It is a very successful retailer and it is to be expected that it will fairly quickly return to what it knows best. The various versions of the Kindle will be allowed to die or, perhaps, the operation will be sold off to an Asian manufacturer. This would leave Amazon free to concentrate on the high volume part of the new business.
So, can we expect high ebook pricing from now on? Well, some publishers hope so, but there are also some who believe in the Amazon pricing model, though not the Amazon monopoly. The UK has now acquired a new ebook outlet from North America, Kobo Books. This has the backing of some of the UK's major publishing houses, including Random House UK, Penguin Group UK, Bloomsbury, Simon & Schuster UK and Faber & Faber, and it is to market ebooks which can be read on any platform (Kobo applications are available on the App Store, on the Android Market, on BlackBerry App World and on the webOS App Catalog). With suggested ebook prices starting at £8.99, the operation is a direct challenge to Amazon's UK market dominance.
AT March 2010
An analysis of approximately 100,000 full ebook downloads from the Smashwords website for the period January 1 - 31, 2010 has provided an insight into the relative popularity of the various ebook formats. The results make interesting reading:
(Table: e-book format against percentage of downloads.)
It is somewhat surprising that the venerable PDF format should top the popularity list by a substantial margin. It has decisively outperformed the more modern, designed-for-purpose ebook formats.
Whilst the PDF format lacks the reader flexibility of its purpose-designed competitors, it does provide many good layout options for generating the ebook in the first place. Further, it is familiar to most computer users. So there may be an element of "better the devil you know" in the selection of the download format.
Whatever the reason for the popularity of the pdf ebook format, it is too soon for Adobe to congratulate itself. The game has just started.
AT Feb 2010
Ultra High Speed Broadband from Google
At least five US cities have said they are interested in taking up an offer from Google to build broadband networks operating at 1 Gigabit per second (1Gbps). These cities are Winston, North Carolina; Peachtree City, Georgia; San Francisco, California; Greensboro, North Carolina and Ontario County, New York.
The firm says it will help fund networks serving between 50,000 and 500,000 customers.
Whilst Google may be able to deliver 1Gbps to a building, it is doubtful that the building's distribution network would be able to run at this rate without considerable modification. For instance, in many homes the distribution network is a wireless one and this is typically limited to 54Mbps. Many businesses have networks limited to 100Mbps. This does not mean that the Google proposals would be of no benefit in these cases, just that the benefit will be restricted to that obtained by the removal of the bottlenecks outside the final destination. The overall system would run more efficiently.
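The bottleneck argument above can be expressed very simply: the achievable end-to-end rate of a chain of network links is the rate of the slowest link, so upgrading the access link only helps up to the speed of the internal network. A minimal sketch (the 8 Mbps "old access link" figure is an illustrative assumption, not from the article):

```python
def end_to_end_rate_mbps(link_rates_mbps):
    """A chain of network links can carry data no faster than its
    slowest member, so the end-to-end rate is the minimum."""
    return min(link_rates_mbps)

# A 1 Gbps Google access link feeding a 100 Mbps office LAN: the LAN
# becomes the bottleneck, but the old access-link bottleneck is gone.
before = end_to_end_rate_mbps([8, 100])      # assumed 8 Mbps access link
after = end_to_end_rate_mbps([1000, 100])    # 1 Gbps access link
print(before, after)  # -> 8 100
```

The benefit is real but capped, which is exactly the point made above: the system runs more efficiently without the external bottleneck, even if the building's own wiring now sets the limit.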
There is much speculation about Google's motive for offering to partly fund this type of network. The suggestions range from pure altruism to a desire to increase its control over what happens on the internet. If the Google online book project is a reliable guide, there is a mixture of motives, both good and bad. The trick for the city officers to pull off is to maximise the good and minimise the bad.
I have said several times before that there is a need to upgrade the London telecom network to cater for the 2012 Olympics. This may be one way to achieve that and, at the same time, ease BT's stranglehold on the overall system.
As far as the legal manoeuvrings in connection with Google's online book plans are concerned, the long awaited judgement of the US District Court on 18th February was more a statement of indecision. Judge Denny Chin said "I'm not going to rule today at this hearing" and "There is too much to digest". He did not specify when he would do so.
AT Feb 2010
BL + BBC = Big
The British Library has been digitising its stock for some time in collaboration with Microsoft and has now started discussions with the BBC with the intention of developing a common approach to technical issues, digital rights management and the distribution of content.
These two organisations have enormous archives (150m items in the BL & nearly 1m hours of TV and radio output at the BBC), with many historic items of incalculable value, and common digital archiving standards will ease the storage & retrieval of images of these items. Researchers must benefit from this initiative.
AT Feb 2010
Microsoft's Windows 7 operating system was released towards the end of October and has proved quite successful. It has certainly proved far more popular than the previous Windows version, Vista. Nevertheless, the rumour mill is now suggesting that the next version (Windows 8?) will be released for manufacturing in July 2011.
From a marketing viewpoint, the two year gap between the release of Windows 7 and its successor seems too short, unless Microsoft was not expecting Windows 7 to be successful. As the present version has been favourably received, it is to be expected that the company will let the release date for Windows 8 slip.
One reason for Microsoft not allowing the release of Windows 8 to slip very far is the knock-on effect for the release of its next but one version of the Office suite. Rumours suggest that this is now due in July 2012. This product is the subject of quite intense competition and Microsoft needs to keep innovating to stay ahead.
AT Feb 2010
Libraries React to Google's Plans
Reuters has reported that the American Library Association, the Association of College and Research Libraries and the Association of Research Libraries have expressed concern that Google's huge book digitisation project would give Google a near monopoly. The American Authors Guild, US publishers and many non-US governments have forced considerable changes to the original Google scheme. Now, library organisations seem to have changed their previously supportive attitude to one of mild criticism. The three associations have asked the US government to urge the court to use its oversight authority to prevent abusive pricing in the online book project.
The non-US country which will be most affected by Google's new enterprise is the UK. However, it is the UK which has had the least to say about it. It is understandable that big UK publishers have allowed their US counterparts to protect their interests, as they are often part of the same organisation. Small UK publishers and the majority of UK authors are not necessarily well served by this. Whether UK public libraries will benefit or be penalised by this US development is not known and there does not appear to be any DCMS interest in finding out. The traditional passivity of the DCMS may be a cause of deep regret in the future.
AT Dec 09
German/French Digital Library
On 2nd December 2009, in response to Google's Book Search program, the German government agreed to provide an initial 5 million euros and an annual 2.6 million euros to fund a German Digital Library.
It has been decided that the German Digital Library will request copyright holders' permission before digitising a work. This is completely different from Google's approach. The US company first digitises works and then allows copyright holders to have their works deleted from the database, if they so wish. The Google method of operation has the advantage that the many orphan works (works without traceable copyright holders) in existence can be digitised and supplied to the public. Thus, the public benefits from a wider choice, Google makes a profit and, if the copyright holder later appears, he/she will also make a profit. Thus, in theory, everyone gains. However, many individuals and organisations are worried about the possible differences between theory and practice. Google seems to be seeking a monopoly which may eventually act against the public interest, even if it is initially beneficial.
A few days after the German announcement, Reuters reported the French President, Nicolas Sarkozy, as saying that he would not let his country's literary heritage be taken away by a "friendly" large American company, i.e. Google. The president said digitisation of books would be one of the projects financed by a planned national loan, which is due to pump billions of euros into strategic investments in 2010
One of the concessions which Google has made to placate its critics is to restrict the scope of its Book project to the major English speaking nations (see index.htm#concess ). Therefore, the German & French government initiatives will partially fill the gap left by the Google withdrawal.
The results of the German digitisation project will be linked to the European Union's Europeana project, which went live in 2008. Unfortunately, so far, there has been a lack of content available from this site.
Of course, the UK is a European nation. So one would expect that it would match the efforts of other major European nations, such as France and Germany, in promoting its culture. This seems unlikely to happen. The UK government will probably be content to allow its culture to be submerged under the dominant US one. It is so easy to do nothing and simply drift at a national level, leaving individual institutions to attempt to redress the balance from their own funds.
The French government has also urged the European Union to agree on a massive digitisation project, effectively competing with Google. If other European governments follow the Franco-German example and the EU provides the backup, European culture will become easily accessible from anywhere in the world. However, it is almost inevitable that the US world domination of this type of activity will be mirrored in Europe in the form of a Franco-German domination. The smaller European countries will be squeezed out. Polish & Czech literary works will be hard to find and Slovene written culture impossible to discover.
AT Dec 09
European Opposition to the Google Book Settlement
Opposition to the Google Book Settlement is growing in Europe, with France, Germany, Austria, Switzerland and Spain having severe doubts about the benefits of the scheme. The US Federal court which has been tasked with deciding whether the settlement is in the public interest has set a Sept. 4 deadline for submissions and plans to make a final decision on Oct. 7. Although some major European organisations are opposed to the settlement, many major publishers support it. Approximately 25,000 of the world's publishers, libraries and individuals have declared their support by allowing Google to digitise their archives and catalogues.
Although there is considerable opposition to the settlement in Europe, it is not clear whether this outweighs the support for the scheme there. For example, the German Government opposes the settlement (it believes that it is illegal in Europe) but the Bavarian State Library is actively collaborating with Google in its digitisation crusade. In France there is much opposition, but there appears to be an acceptance that there is little that can be done by non-US authorities.
A legitimate criticism is that the settlement gives little European representation on the Book Rights Registry, a panel that is supposed to collect and distribute revenue from Google's U.S. book sales to authors and publishers. Sales of their books in the US are important to European authors and publishers.
In the UK, there is little opposition. No doubt, the close US connections of the major publishing houses and the possibility of generating significant income from out-of-print books have influenced this attitude.
At present, the expected decision from the court is a qualified approval. How the court views the potential for creating a monopoly will determine how draconian the qualifications are. That such a potential does exist is beyond doubt and needs to be dealt with. This can be done without outlawing the settlement, as is suggested by one of the leading US opponents, Microsoft, itself a notorious monopolist.
AT Aug 09
In an effort to counteract the criticism of the Google Book Settlement from non-US organisations, Google has proposed that two non-US representatives should sit on the governing board of the registry which will administer the settlement. Google has sent a letter to 16 European Union publishers' representatives outlining such a solution. Thus, 25% of the governing board members would represent the vast majority of the world's authors, publishers and readers. The US-centric nature of Google's planning is clear but unsurprising, given that the basic settlement is totally concerned with US interests.
In a further concession, Google has proposed that books which are still on sale in Europe but which are no longer available to consumers in the United States will now be omitted from the database. Google is uncertain how many titles this would affect.
Whilst these concessions may be just enough to satisfy the publishing houses, they are unlikely to satisfy those who believe that the settlement is unlawful within EU countries.
Early next year, Amazon's large-screen e-reader will get some competition. That is when Plastic Logic plans to launch its 8.5 x 11 inch offering. It will weigh less than most magazines and will be Wi-Fi and 3G enabled. Plastic Logic claims that it will support PDF, Word, PowerPoint and Excel documents. So, it is more versatile than most of the other e-readers on the market and will probably sell for a premium price.
It will be interesting to see whether Plastic Logic concentrates solely on the lucrative US market, as Amazon has done, or whether it attempts to match the Sony world-wide reach. If Amazon continues to delay its promised entry into the European e-reader market (see below), the temptation would be irresistible for Plastic Logic to do that quickly.
AT Aug 09
The Kindle is Imminent (probably)
Amazon is rumoured to be about to announce a UK launch date for its Kindle ebook reader.
It is believed to have outsourced Kindle operations in the UK to Qualcomm. This would include securing connectivity with a UK mobile operator.
Amazon wishes to replicate what is provided for US customers and to allow the download of books through PCs or over Wi-Fi connections. Also, as in the US, it wants to provide the option of a regular download of newspapers, magazines and journals while on the move, via a mobile network. The online retailer is known to have discussed a potential deal with Vodafone, Orange and 3 approximately six months ago, but discussions broke down. One of the probable reasons for the lack of an agreement at that time is that Vodafone and Orange are pursuing their own ebook reader development programs.
An Amazon spokesman said: "We have previously announced that we are looking to release the Kindle with our international customers. At the moment we have no timeline." In spite of this, the rumour mill is saying that Amazon is working very hard to launch the Kindle in the UK for the crucial Christmas 2009 trading period. Agreements with book publishers have already been secured and it is in negotiation with news and magazine publishers.
Recent calculations in the US have shown that the New York Times could save a large amount of money by simply making a gift of a Kindle to each of its subscribers and then supplying the newspaper via the internet. The removal of printing and distribution costs from UK newspapers and magazines would probably look just as attractive.
For Amazon to meet the Christmas 2009 deadline, an announcement has to occur soon. It cannot afford to miss the deadline, as Sony and others are adding to their UK customer bases with every day that passes (see article below). It has probably allowed itself to drift into a position where it needs to attempt a marketing knock-out blow in the UK.
AT July 09
Has Amazon Left it Too Late?
Amazon is a formidable retailer and is quite capable of dominating the market for eBook readers. However, it has yet to introduce its Kindle into the UK or European markets and has, therefore, allowed Sony to build up a significant bridgehead in these very important areas.
In the UK, Sony has obtained its market penetration by allying itself with another powerful book retailer, Waterstones. Now, a second mid-market eBook reader manufacturer has signed a distribution deal with a book retailing chain: Borders is to sell the Elonex eBook reader for £189. This undercuts the cost of the Sony reader by £10; at least, it does until Sony/Waterstones respond.
Unlike the other eBook readers on the market, the Elonex version has very little to offer beyond basic reader use. However, that is what an eBook reader is all about. It has a weight of only 150g (5.3oz), so it is quite light compared with the Sony offering and this will be very attractive to commuters, who often have to hold their (e)book with one hand while strap-hanging with the other.
The Sony and Elonex screen sizes are identical, but the Elonex battery allows 8,000 page turns between charges compared with the Sony's 7,000 page turns. Further, the Elonex Reader's internal memory can store 1,000 books, whereas the basic Sony can only store 160 books. The overall volumes of the two devices are similar, with the Elonex one being slightly longer and thicker, but narrower.
When Amazon gets around to entering the UK market for eBook readers, it will find at least two well entrenched suppliers there already. It will, therefore, be necessary for it to buy its way in with a low price or to offer something technically much better than the incumbents. Of course, Sony et al know that Amazon is coming, so they will be ready with their upgraded counter offerings. Interesting times.
AT July 09
ISBN Numbers to Change
When the public is enthusiastic about a product which has an identification number allocated to each item, it is periodically necessary to increase the range of numbers available. This happens quite regularly with telephone numbers and, a little less frequently with books.
The ISBN migrated from 10 to 13 digits in January 2007 in order to increase capacity. Since then, the book identification number has started with the digits 978. Now the International ISBN Agency has announced that a migration to a 979 prefix has started. Each country will change when it runs out of available 978 numbers; France has already done so.
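The 978 or 979 prefix is part of the full 13-digit number, which also ends in a check digit. For the curious, the ISBN-13 check digit is computed by weighting the first twelve digits alternately 1 and 3; a short sketch:

```python
def isbn13_check_digit(first12: str) -> int:
    """ISBN-13 check digit: weight the first 12 digits 1,3,1,3,...,
    sum the products, and return (10 - sum mod 10) mod 10."""
    total = sum(int(d) * (1 if i % 2 == 0 else 3)
                for i, d in enumerate(first12))
    return (10 - total % 10) % 10

# A 978-prefixed ISBN ends in its check digit, e.g. 978-0-306-40615-7:
print(isbn13_check_digit("978030640615"))  # -> 7
```

The same arithmetic applies unchanged to the new 979-prefixed numbers; only the prefix pool is new.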
AT June 09
Random House Benefits From Free Downloading
In March, Random House announced that it would release five free books through its science fiction portal, all of which came in downloadable PDF files (among other formats). John Hilton, a doctoral student at Brigham Young University, recorded the before and after book sales and found that one of the five books had zero sales in 2009, i.e. there were no sales before or after the free version was made available. However, the other four books saw significant sales increases after the free versions were released. Overall, the combined sales of the five books improved by 11% - they sold 4,633 copies during the eight weeks prior to being released free and 5,155 copies during the eight weeks after.
The other science fiction publisher which is making its books available for free downloading (one per week) is Tor.
AT May 09
The New Kindle E-reader
Amazon has launched a new e-book reader only a few months after launching its Kindle 2. The new reader has a number of enhancements, but also carries a large price increase. A headline comparison of the two offerings is:
             Kindle DX                                  Kindle 2
Display      9.7" diagonal e-ink                        6" diagonal e-ink
Size         10.4" x 7.2" x 0.38" (264 x 183 x 9.7 mm)  8" x 5.3" x 0.36" (203 x 135 x 9.1 mm)
Storage      3,500 books                                1,500 books
PDF support  native PDF reader                          support via conversion
Price        $489.00 (£323/€367)                        $359.00 (£237/€269)
Improved Book Scanning
We have all been disappointed with the result when we have attempted to scan a book page. What we find is that, because the binding of the book causes the page to be arched, the image is distorted.
We may be a little disappointed, but for the Google book scanning exercise the problem is a near disaster. Now Google has found a solution.
Google now projects an infrared pattern onto the page. By using two infrared cameras, Google can determine the pattern's distortion caused by the two-dimensional page arching into a three-dimensional object. From this it is possible to correct the image to reproduce the page as it should be.
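Google has not published the details, but the final flattening step can be pictured as an arc-length correction: once the cameras have recovered the height of the page surface, each point's true position on the flat page is the distance travelled along the curved surface rather than the straight-line distance the camera sees. A toy one-dimensional sketch of that idea (the profile values are invented):

```python
import math

def flatten_positions(xs, heights):
    """Map observed x-positions on a curved page profile to flattened
    positions by accumulating arc length along the measured surface."""
    s = [0.0]
    for i in range(1, len(xs)):
        dx = xs[i] - xs[i - 1]
        dh = heights[i] - heights[i - 1]
        s.append(s[-1] + math.hypot(dx, dh))
    return s

xs = [0.0, 1.0, 2.0, 3.0]
print(flatten_positions(xs, [0.0, 0.0, 0.0, 0.0]))  # flat page: positions unchanged
print(flatten_positions(xs, [0.0, 0.2, 0.6, 1.2]))  # arched near the spine: stretched out
```

On a flat page the mapping is the identity; where the page arches towards the spine, the surface is longer than its projection, so the corrected positions spread out, undoing the compression seen in the scan.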
AT May 09
Amazon.com Inc.'s new Kindle 2 eBook reader costs $185.49 to produce. The direct material cost of the Kindle 2, consisting of all parts used to make the product, amounts to $176.83 (the most costly component is the E Ink display at $60). Adding the conversion costs (manufacturing expenses and the battery) increases the total by $8.66 to $185.49. Marketing and distribution costs must then be added. However, it still appears that Amazon is making a very large profit from the selling price of $359. No doubt the profit would have been larger if Sony had not also been marketing an e-reader, and it will shrink when competition really intensifies.
Full breakdown: https://www.isuppli.com/NewsDetail.aspx?ID=20138
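Putting the iSuppli figures together gives a rough feel for the margin involved. A sketch of the arithmetic (gross margin only; marketing and distribution costs, which as noted above must still be added, are excluded, so the true margin is lower):

```python
materials = 176.83   # direct material cost; the E Ink display is $60 of this
conversion = 8.66    # manufacturing expenses plus the battery
retail_price = 359.00

production_cost = materials + conversion                     # $185.49
gross_margin = (retail_price - production_cost) / retail_price
print(f"cost ${production_cost:.2f}, gross margin {gross_margin:.0%}")
```

That works out to a gross margin of roughly 48% before selling costs, which supports the suggestion that Amazon is making a very large profit on each unit.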
AT April 09
Print on Demand Has Arrived
The first UK bookshop installation of the Espresso Book Machine will be open for business on 27th April at Blackwell's Charing Cross store. This print-on-demand machine will allow a book to be printed in under 5 minutes. The book may be selected from a large library of titles, or uploaded in person from CDs or flash drives. It is also possible to use the new machine online at Blackwell's website.
The public now has an opportunity to obtain many out-of-print/difficult-to-obtain books and to publish its own books. Blackwell effectively increases its stock, without needing more space, and makes greater profits (hopefully).
From the public's viewpoint, the viability of the new enterprise is critically dependent on the quality of the books produced by the machine. It is claimed that the books are of library paperback quality. This could mean something that will last two years and then fall to pieces. Time will tell.
If the books produced by the Espresso Book Machines really are of the same quality as those purchased by libraries, the new facility offers libraries a new source of stock. Whilst the libraries most likely to make use of this are the academic and specialist libraries, public libraries cannot ignore it.
AT April 09
Google / Authors Guild Agreement Challenged
Google has been scanning out-of-print, but in-copyright, books since 2004 and has reached a historic agreement with American authors and publishers to make these available to the public. As the books have been abandoned by authors and publishers, this is a very good deal for both of these groups. Possibly, it is also a good deal for the public. Whether there is more than a marginal gain for the public is still an open question, as the details of the agreement will determine this.
Copyright specialists, antitrust scholars, and some librarians suggest that rights to orphan works should not be restricted to Google and oppose what they allege is a private rewriting of the copyright rules for millions of texts; legislation is being circumvented. Their main fear is that Google would abuse the monopoly position which the agreement would give it.
The American Library Association, the Institute for Information Law and Policy at New York Law School, and a group of lawyers coordinated by Harvard Law School intend to raise concerns when the agreement is considered by a Federal court.
One of the worries of the librarians is that orphan works could make up the bulk of some library collections.
The settlement, which is restricted to books protected by copyright in the United States, allows Google to show readers in America up to 20% of most copyrighted books, and will sell access to the entire collection to universities and other institutions. American public libraries will get free access to the full texts for their patrons at one computer and individuals will be able to buy online access to particular books.
Proceeds from the program will be divided between Google (37%) and authors & publishers (63%). Google will also help set up, i.e. pay for, a Book Rights Registry, run by authors and publishers, to administer rights and distribute payments.
Authors are permitted to opt out of the settlement or remove individual books from Google's database. Thus, authors who are still alive are likely to remove many books from the database.
The Registry's agreement with Google is not completely exclusive, as the Registry will be allowed to license to other organisations the books of those authors and publishers who have explicitly authorized it to do so. Since no such authorization is possible from dead authors, only Google would have access to the works of these authors. Therefore, only Google could assemble a truly comprehensive book database.
AT April 09
The Ereader Battle Heats-Up
Sony has reacted to recent moves by Amazon to corner the eBook market. The Japanese company has signed an agreement with Google to make 500,000 books available for the Sony eReader. These books were all written before 1923 and are all out of copyright. The collection can be expected to include most of the classics of English literature and should pass the Jane Austen test without difficulty.
The Sony agreement covers only a fraction of the seven million books scanned by Google since 2004. Unsurprisingly, Sony has expressed a wish to continue working with Google.
Amazon commented that its eBook collection of 250,000 titles includes books in copyright. The strong Amazon position in the book distribution marketplace gives it a competitive advantage in this respect. However, the recent Google/Authors Guild agreement would erode this significantly, if it gains Federal approval and if Google continues to work with Amazon's competitors.
AT April 09
US E-Book Sales Take Off
The International Digital Publishing Forum / Association of American Publishers data on quarterly US trade retail e-Book sales showed a steep rise during 2008. Not included in the figures are sales for libraries and educational establishments.
AT Mar 09
We have just got used to the improved performance of USB 2. So now, we have to expect a further enhancement to keep the technology pot boiling. More or less on schedule, USB 3 will start to appear in consumer products in the middle of this year and will be widely deployed in 2010. For some applications, such as printer interfaces, the increased speed of USB 3 will make no noticeable difference. However, for many uses, the tenfold speed increase is vital to support the growing sophistication and complexity of new applications.
The USB 3 specification calls for a bit rate of 5Gbps. However, in practice, this may fall to something like 3Gbps - still a major improvement over USB 2.
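The gap between the 5Gbps headline figure and the roughly 3Gbps practical figure comes largely from line coding and protocol overhead: USB 3 uses 8b/10b encoding, so only 8 of every 10 transmitted bits carry data, and framing and flow control take a further cut. A rough sketch (the 80% protocol-efficiency figure is an illustrative assumption, not a specification value):

```python
def usable_throughput_gbps(line_rate_gbps,
                           coding_efficiency=0.8,     # 8b/10b: 8 data bits per 10 line bits
                           protocol_efficiency=0.8):  # assumed framing/flow-control cut
    """Estimate usable throughput from the raw line rate after line
    coding and (assumed) protocol overhead are subtracted."""
    return line_rate_gbps * coding_efficiency * protocol_efficiency

print(usable_throughput_gbps(5.0))  # about 3.2 Gbps, close to the ~3 Gbps cited above
```

Even the pessimistic figure remains several times the USB 2 high-speed rate of 480Mbps, so the generational improvement stands.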
AT Feb 09
The Kindle 2 & Google Mobile BookSearch are Launched
The Amazon.com founder and CEO, Jeffrey P. Bezos, has unveiled the new Kindle e-book reader, the Kindle 2. This has a new "Text-to-Speech" facility that converts words on a page to spoken words, so consumers have the option to read or listen. A good idea, if you want to hear all books read with an American accent. However, the US Authors Guild claims that this facility infringes copyright laws.
Of more general use is the increase in memory size of the new Kindle. It now has 2GB and can store 1500 books compared with the 200 books for Kindle 1.
The battery life has also been increased by about 25 percent. The extended battery life will allow Kindle 2 users to read for four to five days on a single charge, if the device's wireless service is turned on. With wireless off, this rises to two weeks.
The Kindle 2 is significantly thinner than its predecessor at 0.36 inches compared to 0.7 inches and it is very slightly lighter (10.2 ounces, 289g). The electronic ink display has also been improved, changing from a 4-shade to a 16-shade grayscale screen.
The original Amazon offering, Kindle 1, has only been available in the USA but, despite this limitation, analysts estimated that Amazon sold 500,000 during 2008 and is expected to generate $US1.2 billion (£800 million) in e-book-related sales this year.
At present, the Kindle Store offers customers over 230,000 titles and numerous newspaper subscriptions (the NY Times has 10,000 subscribers paying almost $14 a month to read the paper on the Kindle). The average Kindle user buys 1.6 e-books for every ink-on-paper book he/she purchases.
As expected, there has been no price change. The Kindle 2 costs an impressive $359. Not exactly a notably good buy during a recession.
In the same week as the Kindle 2 launch, Google announced that it plans to provide competition for ebook readers such as Amazon's Kindle and Sony's eReader with its launch of a mobile version of the Google Book Search service. Users wishing to read any of the 1.5m books currently available on the service can now do so by visiting a mobile-friendly version of the site that has been optimised for all mobile phones, but in particular the Apple iPhone and the Google-backed G1 Android handset from T-Mobile.
The Google Book Search previews are page images made by digitising physical copies of books. These page images work well when viewed from a computer, but prove unwieldy when viewed on a phone's small screen. The system uses OCR (Optical Character Recognition) software to scan pages so that they flow in the mobile browser just like any web page. Google has warned that the free books might not be completely perfect.
Google has been working on the mechanisms to make its Book Settlement with the Authors Guild become a practical reality and to significantly expand the number of books available on the internet. The settlement requires Google to provide approximately $30 million to set up a Book Rights Registry (BRR). This will be an independent agency that will function as a Copyright Clearance Center, to monitor and dispense rights and payments.
AT Feb 09
Zero Cost Self-Publishing
Everyone is supposed to have at least one book in them. Thankfully, not everyone wants to get that book published. However, a huge number of people do, and they face a great uphill struggle to achieve their ambition. Self-publishing, which involves a low to medium cost, is one answer.
Now there is a version of self-publishing which is alleged to do away with cost entirely. Unsurprisingly, this version uses the internet to provide a platform for would-be authors. The publicity for a new website says:
With myebook.com, we've made it possible for anyone to upload or create from scratch beautifully simple or adventurously complex page designs and covers online, in no time. What's more, users can publish their book with a single button and release it to the world before the (virtual) ink's dry! They can create as many publications as they want. And it's all free. It's the ultimate ebook platform for any personal or corporate publications, whether it's for novels, children's books, magazines, comics, photo albums, leaflets, brochures or instruction manuals. You can even embed or link to videos, audio, documents, images and Flash files to make your books fully interactive.
There seems to be only one problem left to solve, apart from writing the opus. How do you get people to read it?
AT Feb 09
E-Books in Libraries
A recent survey in the UK conducted by NetLibrary, OCLC's Ohio-based eContent division, has found that three-quarters of academic libraries and half of public libraries that responded intend to increase their collections of eBooks over the next year, in spite of the current difficult financial climate.
85% of public libraries responding to the survey indicated that they were most interested in developing fiction eBook collections, despite recent research that suggests eBooks are most often used for reference purposes. Possibly this trend is being fuelled by the growth in take-up and availability of eBook reading devices, such as Amazon's Kindle and Sony's Reader, among public library users. Similarly, a rise in the usage of MP3 players could explain why 65% of public libraries also indicated an intention to further develop their eAudiobook collections.
Half of the responding academic libraries said that their use of eBooks was to support their core reading lists, the major areas being Business / Management (13%), Medicine / Health (9%), Education (6%) and Engineering (5%).
AT Jan 09
The Kindle Yet to Come
Whilst the Amazon Kindle ebook reader has been available in the US for some time, it has not yet been put on sale in the UK. This is no great hardship for us, as there are other offerings to which we can turn. In the meantime, some of the deficiencies of the present Kindle design can be addressed. Kindle 2 is now ready for its launch.
Rumour suggests that the new design will replace the dedicated battery charger with charging from the USB connection. Nobody is expecting a noticeable price reduction, something that is necessary if ebooks are to become widely used.
Coincidentally (perhaps), it looks as though Amazon has declared an ebook war. It has notified its publisher and author clients that it plans to cease offering e-books in the Microsoft Reader and Adobe e-book formats. It will concentrate on ebook offerings using its own Mobipocket format.
AT Jan 09
A pilot scheme to collate in-depth information on books borrowed has been run by Nielsen Book, covering 156 libraries from eight UK public library services. The service, LibScan, has been claimed to be outstandingly successful and Nielsen is planning to launch it nationally in the summer of 2009, covering all of the library services in the UK. The full service, free to participating libraries, will provide accurate borrowing figures which libraries can use to improve their stock selection and promotion and to optimise book budgets.
Proven analysis technology will be used to collect and measure public library book borrowing statistics. The data will be collected weekly and regular 4-weekly reports will be made available to the participating library services via an online analysis system.
How attractive the proposed analysis service will be to a library authority depends very much on how much extra cost is involved in extracting the raw data from the library management system in a form suitable for the analysis. Of course, library managers should have been carrying out such an analysis already. So, perhaps there will not be any extra cost.
USB Takes a Step Forward
The specification for USB3 has been finalised. This upgrade of the serial interface method will take the speed up to 4.8Gb/s and allow the download of a 25GB HD film in 1min 10sec. It is expected that devices incorporating USB3 will become available in 2010.
USB3 will, of course, be backward compatible with the current version, USB2.
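As a rough cross-check of those figures (a sketch assuming the quoted 4.8 Gb/s is the raw signalling rate with no protocol or encoding overhead, which is why the real-world figure of 1 min 10 sec quoted above is longer):

```python
# Minimum theoretical transfer time at a given link rate.
# Assumes no protocol/encoding overhead, so real transfers take longer.
def transfer_time_seconds(size_gigabytes, rate_gigabits_per_sec):
    """Time to move size_gigabytes at rate_gigabits_per_sec (8 bits per byte)."""
    return size_gigabytes * 8 / rate_gigabits_per_sec

usb3 = transfer_time_seconds(25, 4.8)   # 25 GB HD film over USB3
usb2 = transfer_time_seconds(25, 0.48)  # same film over USB2 (480 Mb/s)
print(f"USB3: {usb3:.0f} s, USB2: {usb2:.0f} s")  # USB3: 42 s, USB2: 417 s
```

The tenfold speed-up over USB2 is the headline improvement; the gap between the 42-second theoretical minimum and the quoted 70 seconds is overhead.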
AT Dec 08
The One Laptop per Child arrives in Europe
The One Laptop per Child Foundation has made its XO laptop available to European customers. Its G1G1 (give one, get one) scheme will allow European customers to buy an XO laptop and simultaneously donate one to schools in developing countries.
The low spec XO laptop will not be very attractive to European adults and will be totally unacceptable to fashion-conscious European teenagers. However, its rugged design and bright colour should make it suitable as a first computer for primary school children. A netbook would probably be a better buy though. Public libraries will find it unusable.
Dell/HP/Mesh/Evesham et al will not have great worries about losing market share to the new competitor in Europe, especially as the cost to UK buyers (from Amazon) will be £275 plus £50 delivery. The outrageous delivery cost probably kills any hope of a high sales volume. The £50 delivery cost should be compared with the zero delivery cost of any Amazon purchase costing over £5.
Time is of the Essence
There has been a degree of scepticism about the challenges that will face public libraries in the medium-term future. In particular, the effects of e-book readers on the public's reading habits are often downplayed (as Nick Hornby has done recently). This disbelief is usually based on the usefulness and likely market penetration of the e-readers that are currently available.
The first generation of any new appliance is inevitably somewhat clumsy and even difficult to use. There is an element of this in the existing crop of e-book readers, but they are good enough to show the potential of what is to come.
There is definite promise shown by the presently available e-book readers, although they suffer from high cost and restricted content. It is usual for time to correct both of these problems and this correction will, in turn, determine the way that public libraries are affected.
John Hughes's (Head of Libraries in Lewisham) suggestion seems to be coming to pass: there will probably soon be enough books freely available from the internet to justify at least a limited investment by library services in e-book readers.
The Gutenberg Project has undoubtedly been the trailblazer in making out-of-copyright books freely available on the internet and it is still adding to its list ( Main2.htm#books ). Encouragingly, the willingness of book-trade stalwarts, such as Bloomsbury and Harper Collins (index.htm#bloom & index.htm#harper ), to join the internet publishing trend by making some of their in-copyright books freely available on the internet adds another valuable dimension.
Of course, the very big fish in this pond is Google and the recent agreement between it and the Authors Guild on the scanning of in-copyright books without permission could provide a possible base for the development of a large e-book lending service, in addition to commercial sales. The deal, which is still subject to approval by a federal court in Manhattan, was made after three years of negotiations between Google, American publishers and American authors and, according to the British Booksellers Association, is expected to be exported to Britain and Europe.
As book reading and internet use are showing signs of merging into one activity, an interesting piece of research has been published in the US about the effect of each activity on the brain ( Main2.htm#brain ). They are both good for you, but the internet version is better.
All these developments are beginning to create the situation that I predicted four years ago ( Main2.htm#future ). It was not difficult to see what was going to happen, but the MLA and LLDA, which should have been planning for it, have been very slow to do anything noticeable. Time is beginning to run out and time is of the essence.
AT Nov 08
The Internet for Young & Old
The government has announced that it will run a pilot scheme to assess the benefits of providing free laptop computers and free internet access to underprivileged children. This seems to be an admirable initiative and the only criticism that can be levelled at it is that children in London will have to wait for the outcome (inevitably, positive) before they are offered anything.
The new scheme seems to acknowledge the limitations of the People's Network. This popular, public library based scheme has succeeded in spreading internet usage to poorer sections of society, but suffers from the limit placed on maximum usage per day. Thus, it is only suitable for the more superficial tasks. If you wish to carry out intensive tasks, such as difficult homework assignments or protracted research, you are on your own.
Nicholas Negroponte from MIT has, for several years, been working to implement a similar scheme to that proposed for the UK, but on a much grander scale. His target is all the world's really poor children (see: Progress Towards the $100 Laptop below). It would be rather nice if the UK could assist the bigger scheme by using some of the technology that has been developed for it.
Virtually simultaneously with the government's announcement of its children's laptop initiative came news of research into the effect of internet usage on people at the other end of the age spectrum. From a slow start, the elderly section of the population has been catching up with the rest of society in its use of the internet (see: Internet Demography, UK's Internet Market Matures, The Greying of the Internet below). This trend was considered to be a good thing, as it helped to keep old people in touch with friends and relatives and allowed them to pursue their interests, in spite of declining mobility. Now, it has been found that internet use by the elderly can have a direct beneficial health effect.
American researchers have carried out brain scans on 24 middle-aged and elderly volunteers (ages 55 to 76) with healthy brain function and these showed that going online stimulated larger parts of the brain than the relatively passive activity of reading a book. Variables such as age, educational level and gender were carefully eliminated from the research findings.
The project results did not contradict the belief that book reading stimulates the brain. Indeed, all participants showed significant brain activity during the book-reading task. The regions controlling language, reading, memory and visual abilities, which are located in the temporal, parietal, occipital and other areas of the brain were those shown to be stimulated by book reading. For the internet using group, there was also activity registered in the frontal, temporal and cingulate areas of the brain. These control decision-making and complex reasoning.
With the aging UK population and the increasing concern over Alzheimer's disease (reference: https://uk.reuters.com/article/topNews/idUKTRE4AP4UP20081126?rpc=401& ), this study is obviously very relevant. The researchers believe that the study shows a method of helping people maintain healthier brains into their old age, combating age-related general brain wastage and reduced cell activity. So, old people should be encouraged to use the internet more. This is a very simple solution to a difficult problem - unless the old people are also poor.
There seems to be no reason why the elderly should be treated less generously than young people. Therefore, the government should consider providing laptop computers and free internet access for this section of the population. In fact, the American research evidence suggests that this should happen immediately, rather than after a pilot scheme. Unlike the scheme for children, the pay-back would be obtained almost immediately with lower NHS costs.
In the meantime, library managers should note that reading does give a good stimulus to the brain. Depriving the public of reading matter by reducing book stocks is bad for the health of the elderly.
More information: https://www.telegraph.co.uk/news/newstopics/debates/3195748/The-internet-beats-books-for-improving-the-mature-mind-say-neuroscientists.html
AT Oct 08
William Patry Withdraws
William Patry has been a sensible and knowledgeable contributor to the copyright debate for the last four years, but has now given up the struggle. Why?
He gives two reasons:
1. People do not accept that he is commenting in his personal capacity. As he is a senior Google copyright lawyer, this is not very surprising. His claim that he is commenting as a private individual is probably true. However, that individual has to have been influenced by the environment in which he spends most of his day i.e. by the Google corporate ethos. Actually, the fairly moderate views expressed by Mr Patry reflect well on his employer.
2. William is too depressed by the state of copyright law to continue the fight. His analysis is:
Copyright law has abandoned its reason for being: to encourage learning and the creation of new works. Instead, its principal functions now are to preserve existing failed business models, to suppress new business models and technologies, and to obtain, if possible, enormous windfall profits from activity that not only causes no harm, but which is beneficial to copyright owners. Like Humpty-Dumpty, the copyright law we used to know can never be put back together again: multilateral and trade agreements have ensured that, and quite deliberately.
It seems that we are all doomed to a continuing regime of slightly restrained exploitation. This guarantees that widespread piracy will also continue.
Epitaph for the E-Book?
"Attempting to sell people something for £400 that merely enables them to read something that they won't buy at one hundredth of the price seems to me a thankless task." This is Nick Hornby's epitaph for the e-book reader, appearing in a periodical/blog near you.
Nick obviously prefers dead-tree books to electronic books. However, he has a good point about the cost. Time will solve that problem though, just as it will rectify the rather clunky appearance of present day electronic offerings. Compare the large, ugly and expensive early mobile phones with today's slinky, affordable versions. The same development curve applies to electronic book readers.
It is too soon to write an epitaph for the electronic book. It is almost too soon to say it has been properly born. The examples that are around at present probably bear only a passing resemblance to the final products which will be widely accepted. This brings us to the mystery of the Kindle sales volume.
I have actually seen someone using a Kindle. I had to get on an aeroplane to do so, as it has been totally invisible in the UK. Even in the US, the Kindle is a rare sight. Amazon says that it is very happy with Kindle sales, but it refuses to provide any figures. Amazon's strange marketing strategy for the Kindle continues (see Amazon's Kindle).
AT July 08
Accessibility and ICT
Guidelines have recently been released intended to make ICT systems more usable by older and disabled people.
The areas of disability covered by the guidelines include deafness (partial & total), blindness (partial & total), physical, cognitive and language. Older people tend to suffer from one or more of these impairments to a lesser degree and these are also covered in the guidelines.
More information: https://www.tiresias.org/research/guidelines/index.htm
AT June 08
Microsoft Stops Scanning Books
On Friday 23rd May, Microsoft announced that it was withdrawing from the Open Content Alliance (see Open Content Alliance below) and would stop scanning books. Microsoft was the dominant member of the Alliance and this must be a grave blow to its operations. Microsoft is, at present, scanning books in the British Library and it is not clear if this will continue.
The other major book scanner, Google, will now be the undisputed leader in the field.
More information: https://www.theregister.co.uk/2008/05/27microsoft_google_book_monopoly/
AT May 08
Further evidence of the changing make-up of the internet population has been published. A research report, "An Overlooked Audience", has been released jointly by Focalyst and Dynamic Logic and shows that the 17 million US citizens born before 1945 spend a total of 750,000,000 minutes per day on the internet, i.e. on average 44 minutes per person per day. The increasing popularity of internet use among the mature section of western populations appears to be a firmly established trend - see The Greying of the Internet & UK's Internet Market Matures below. Conversely, the popularity of internet activities among the very young appears to be declining. Perhaps it is the more solid, mundane tasks that mature users perform on the internet that are the secret of its steady rise in popularity in this section of the population. There is nothing ephemeral about these uses:
Contact family and friends: 59%
Health and health-related information: 38%
Exchange photos with family/friends: 33%
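As a quick sanity check on the survey's two headline figures (17 million users and 750 million daily minutes), the quoted per-person average follows directly:

```python
# Cross-check of the Focalyst/Dynamic Logic figures quoted above.
total_minutes_per_day = 750_000_000  # minutes spent online by the whole group
users = 17_000_000                   # US citizens born before 1945 who are online

per_person = total_minutes_per_day / users
print(f"{per_person:.0f} minutes per person per day")  # 44 minutes per person per day
```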
More information: https://www.dynamiclogic.com/na/pressroom/releases/?id=605
AT May 08
Better Use of Libraries Book Stocks
A universal cause for concern of Library Friends Groups is the state of the book stocks in their libraries. Library managers find it easier to reduce book stocks when budget cuts have to be made, rather than carry out more painful adjustments. Thus, repeated cuts leave public libraries in a parlous state. The library users react to this in a quite rational way: they go elsewhere for their books, if they can. This means that the middle class deserts the public libraries and the less prosperous are left with a progressively poorer range of books to borrow.
Of course, rebuilding book stocks is the only long-term solution to the problem. In the meantime, short-term measures can be taken to alleviate some of the symptoms of the illness. One of these short-term measures would be improved utilisation of the remaining stock.
It is a boast of professional librarians that they are able to point library users towards alternative authors, when those users find that the books of their favourite writers are no longer on the library shelves. I am not sure how many of the public actually use this opportunity, although I suspect that very few do. There has certainly been no attempt to publicise this redirection service in any public library that I have visited recently. This is a pity, as the users would gain immensely and, I think, library staff would find that life became more enjoyable (they would be talking about the subject in which they have a deep interest).
The danger of making a determined effort to encourage users to seek advice is that this activity may become too popular and swamp the staff. This will not be a danger for long, as there is a new, infant service which automatically does the same thing. The BookLamp internet service uses software to analyse the contents of books in a similar manner to the Music Genome Project which has been analysing music since 2000. The software records writing style, tense, perspective, action level, description level and dialog level. The user simply gives the software an example of a book that he/she has enjoyed and receives back book recommendations.
The effectiveness of the new service is dependent on the sophistication of the software and the number of books which have been scanned. At present, only a few hundred books have been scanned and this obviously limits potential choice. However, the number is increasing continuously.
The BookLamp scheme is a good idea, but it is probably a little too late to succeed as a stand-alone project. Both Microsoft and Google have massive book-scanning operations underway and have already scanned many thousands of books. It would take very little effort for either or both to adapt their software to mimic that of BookLamp. The intense rivalry between the software giants will ensure that this quickly happens. While we wait for the definitive automatic book recommendation system, there is no reason why the old-fashioned, reliable manual system should not be used more.
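BookLamp has not published its algorithm, but a content-based recommender of the kind described can be sketched as follows: score each book on the measured features (action level, description level, dialog level and so on) and recommend the titles whose feature profiles lie closest to the user's example. The titles and scores below are invented for illustration; the real feature set and scores are BookLamp's own.

```python
import math

# Hypothetical feature profiles: 0-1 scores for (action, description, dialog).
# These titles and numbers are invented; BookLamp's real data is not public.
books = {
    "Book A": [0.9, 0.2, 0.6],
    "Book B": [0.8, 0.3, 0.5],
    "Book C": [0.1, 0.9, 0.2],
}

def cosine(u, v):
    """Cosine similarity between two feature vectors (1.0 = identical profile)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = lambda w: math.sqrt(sum(a * a for a in w))
    return dot / (norm(u) * norm(v))

def recommend(liked, catalogue):
    """Rank the other titles by similarity to the book the user enjoyed."""
    target = catalogue[liked]
    others = [(t, cosine(target, v)) for t, v in catalogue.items() if t != liked]
    return sorted(others, key=lambda pair: -pair[1])

print([title for title, _ in recommend("Book A", books)])  # ['Book B', 'Book C']
```

The same idea scales to hundreds of features and thousands of books; the hard part, as noted above, is scanning and scoring the books in the first place.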
More information: https://beta.booklamp.org/ , https://www.pandora.com
AT April 08
Socio-economic Barriers to Internet Usage are Disappearing
A report on broadband access to the internet, published by the Organization for Economic Co-operation and Development, has found that some socio-economic barriers to internet usage are disappearing. Factors such as education, income, age, gender, and place of access have influenced internet usage and broadband access in the past.
Previous studies have found that better educated individuals were more likely to adopt broadband than those with less education. However, this study has now concluded that education is less of a factor affecting internet adoption.
While older people may use the internet as much as younger people, the elderly are less likely to go online for shopping and entertainment. A person's retirement age appears to be more of a factor than the person's actual age, the study found. Thus, the tendency for people to retire later has increased the use of broadband among the elderly. Presumably, the higher income from continued employment and the opportunity to obtain internet access at the workplace are the determining factors.
The report has found that men are more likely than women to download software, and women are more likely to engage in health-related activities and online shopping.
There are also gender differences in where internet access occurs. Men are more likely to gain access at both home and work in many of the OECD countries. Women are more likely to access the internet from educational establishments.
More information: https://www.oecd.org/dataoecd/44/11/39869349
AT March 2008
Rollup E-Reader is Here - Almost
Polymer Vision, a spin-off company from the Philips group, has announced that it intends to introduce a combined cell phone/e-reader/MP3 player in the middle of this year. The new reader, called Readius, will differ from the Amazon Kindle and Sony Reader in that its monochrome display can be rolled up. The display, when unrolled, will have a diagonal measurement of 125 mm, giving it a size similar to an A7 sheet of paper and just over half the size of a paperback page (diagonal approx. 210 mm), i.e. large for a cell phone but a little small for a book. The small size will be an advantage in crowded commuter trains, as it allows single-handed use.
The new device will have mass storage provided by a micro SD card rather than a micro hard disc, giving a memory of up to 8GB. Thus, weight (115g) and thickness have been kept to a minimum.
The Readius is closed by folding the display around the core containing the battery and electronics. The overall dimensions of the closed e-book are 115 x 57 x 21 mm, small enough to allow the reader to be easily carried in a pocket, handbag or briefcase.
AT Feb 08
UK's Internet Market Matures
The market research organisation Nielsen Online has found that the UK's active home internet population fell in November by 1.14%. Of the 10 developed countries that the Nielsen survey covered, only two exhibited a fall. The other non-trend country, Switzerland, restricted its fall to 0.81%. Overall, the developed world's active domestic internet population increased by 1.96% in November, with Brazil (+8.31%) and Australia (+7.28%) showing the greatest growth.
Another survey from Nielsen has shown that, for the year Oct 06 to Oct 07, the under-25s' share of the UK internet population decreased from 29% to 25% (a relative drop in share of 16%). During the same period, the share made up of over-55s increased from 16% to 19% (a relative increase of 22%). Overall, the average age of the UK home internet population rose from 35.7 to 37.9 over the period. This result is very similar to that of a survey of US internet users taken two years ago (see The Greying of the Internet below).
The Nielsen surveys suggest that the falling UK internet usage is mainly caused by the younger part of the population becoming less active in the medium. If this is an accurate picture of the whole of the 2 to 24 year-old group, it has profound implications for the reported falling child literacy levels in England and Scotland, attributed by the National Foundation for Educational Research (NFER) to high internet usage levels by young children.
Of course, it is too soon to celebrate. It is not certain that the NFER is correct in blaming the internet for falling child literacy levels and, if it is, the fall in youngsters' internet usage needs to continue in order to return the UK to its previous literacy performance.
AT Dec 07
Amazon has launched its Kindle e-book reader to mixed reviews. The most repeated criticisms are that it is ugly and expensive. Of course, beauty is in the eye of the beholder and one will never get complete agreement on this aspect. However, cost is another matter. At $399 in the US, the Kindle costs approximately the same as a reasonable, low-end laptop computer. Unfortunately, it does not have the versatility of the computer. The problem of the high initial purchase price is further compounded by a fairly high cost for individual books ($9.99). Amazon is certainly not trying to follow George Eastman's proven marketing strategy, used very successfully not only by Kodak but also by the mobile phone companies: virtually give away the camera/phone and make lots of money on the subsequent consumable purchases.
Amazon is a marketing driven organisation. So, it is strange that the Kindle launch should be very low-key and even stranger that Amazon should say that it does not expect large volume sales. It looks as though Amazon is simply testing the market and that it will make its serious move when it knows more about the potential.
More views: https://www.publishingnews.co.uk/pn/pno-news-display.asp?K=e2007112212422839&TAG=&CID=&PGE=&sg9t=67ef20921fbab1c4a1b9ad91edce950e
AT Nov 07
The Future of the Book
For some time, this webpage has been propagating the view that the ubiquitous printed book is due to morph into an equally popular electronic book. Unsurprisingly, this view appears to be more widespread in the USA than on this side of the Atlantic.
In the USA there is even a think tank dedicated to investigating the evolution of the book. This organisation, the Institute for the Future of the Book, has now opened up shop in London, based at the London School of Economics.
The change from ink-on-paper technology to electronic technology has huge implications for public libraries and the arrival of the Institute should help the expected transition to occur with a minimum of disruption. Of course, for this to happen, the Institute needs to spread its message far outside the academic world. However, the academic libraries should be able to pioneer the necessary changes more easily than the public ones.
More information: https://www.futureofthebook.org/
The Public's Views on Technological Innovation
At the beginning of May 2006, the government (Department for Innovation, Universities & Skills) commissioned a project called the Sciencewise programme in order to explore the public's views on the wider implications of themes in science and technology that had emerged from previous strategic horizon-scanning work. The project report has now been published.
The project's primary aims were:
- to discover views about the issues raised by possible future directions for science and technology, from a broad set of participants;
- to inform policy and decision-making on the direction of research and the regulation of science and technology;
- to help identify priorities for further public engagement on areas of science and technology.
Its secondary aims were to:
- widen public awareness of the role of science and technology in shaping the future of the UK;
- improve public confidence in the Government's approach to considering wider implications of science and technology;
- increase understanding of the value of public dialogue in shaping policy and decision-making in science and other policy areas;
- improve understanding of how to engage large numbers of people in discussions and dialogue on science and technology-related issues, particularly issues arising from new and emerging areas of science and technology;
- strengthen coherence and collaboration among science engagement
As digital technology is beginning to have a significant effect on the operation of public libraries in the UK, this project is potentially helpful in predicting the reaction of the public to any proposed changes.
During the first half of 2007, a national series of public conversations was held about new technologies and their impact on future societies. These consisted of:
- a Deliberative Panel with a diverse group of some 30 members of the public and invited expert speakers;
- Facilitated Public Events in science centres and other community venues;
- Self-managed Small Group Discussions run by community bodies such as schools, Women's Institutes and faith groups.
The project's findings were:
There were no striking divisions of opinion by social class, gender, ethnicity or age. Broadly, people were likely to be positive, with important qualifications, about developments in science and technology that seemed to promise gains in choice, quality of life, longevity, convenience, time-saving and environmental impact. Potential impacts on social equity, freedom, privacy and human autonomy and skills were regarded with considerable suspicion or hostility. A minority felt largely anxious about the perceived domination of young people's lives by computer technologies, and the potential for a dehumanised future.
The approved aspects of technological developments are unsurprising. The perceived negative aspects appear to be associated with social engineering, political correctness & (alleged) public safety and are, presumably, reactions to policy makers past mistakes. Technology can often be used, with equal ease, to benefit or to damage the public. It is the decisions of the policy makers that determine the balance. Too often, these people make their decisions with little understanding of the technology and its potential for good or bad. If the UK is to remain a leading economy, there is no doubt that it will have to enthusiastically embrace new technology. It would be extremely helpful if the UK could also improve its policy-making machinery.
AT Sep 2007
Mobile Phone Masts Proved Innocent
IEI-EMF is defined as idiopathic environmental intolerance with attribution to electromagnetic fields. It is a condition in which an individual experiences non-specific symptoms and attributes the cause of these symptoms to exposure to electromagnetic fields. This is the type of reaction that is usually blamed on mobile phone masts or WiFi antennas and includes tiredness, tension & anxiety.
Scientists at the University of Essex have investigated the effect of mobile phone signals on 44 people who complained of these symptoms and 114 people who had not. The research team, which included psychologists, engineers and a doctor, found that switching a mobile phone mast on and off made no difference to measurements of heart rate, blood pressure & skin conductance.
The group which had previously experienced the symptoms reported that they felt less well when they knew that the mast was active. However, when neither researchers nor subjects knew when the signal was present, there was no on/off difference in the number of symptoms reported. The sensitive group reported more symptoms, of greater severity, than the control group, but this was unaffected by whether the mast was on or off.
The researchers found that the sensitive group had a higher skin conductance. As skin conductance is a good measure of physiological response to environmental stress, they concluded that the symptoms experienced were real, even if the cause was not the mobile phone signal.
The university's exposure system and testing environment were verified as accurate by the National Physical Laboratory.
Although the study did not use the WiFi frequency bands, it does tend to suggest that the fears expressed about the widespread use of WiFi are unfounded. It certainly provides no reason to halt the deployment of WiFi in public libraries. Of course, we are still left with the intriguing mystery of the true cause of the symptoms.
The study was reported in the journal Environmental Health Perspectives and can be found at:
Subsequent to the above posting, the government's Mobile Telecommunications and Health Research (MTHR) Programme published its conclusions: the six-year research programme found no association between short-term mobile phone use and brain cancer. Studies on volunteers also showed no evidence that brain function was affected by mobile phone signals or the signals used by the emergency services (TETRA), i.e. the Essex University research and the MTHR research reached similar conclusions.
French company, Nemoptic, has announced the availability of a 210x297mm e-paper display. This bistable, A4 size, LCD display is less than 2mm thick. As the display is bistable, it only draws power from its supply when the image is changed.
At 200 dpi, the image resolution is similar to that of print on paper. The brightness of the image is greater than 30% (brightness is a relative measure of whiteness, where black = 0% & white = 100%).
The display refresh time of less than 1 sec is only slightly longer than the time necessary to turn a page of a book, so the module can provide a practical basis for an electronic book reader.
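As a sanity check on the figures quoted above, the pixel count of an A4 panel at 200 dpi can be worked out directly (the arithmetic below is illustrative, not taken from Nemoptic's data sheet):

```python
# A4 page is 210 x 297 mm; the panel is specified at 200 dots per inch.
MM_PER_INCH = 25.4

width_px = round(210 / MM_PER_INCH * 200)   # 1654 pixels across
height_px = round(297 / MM_PER_INCH * 200)  # 2339 pixels down
total = width_px * height_px                # ~3.9 million pixels

print(width_px, height_px, total)
```

Refreshing nearly four million bistable pixels in under a second is what makes the sub-1-second page turn quoted below plausible.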
The technology developed by the French company has been licensed to Seiko Instruments for high volume production and appears to give a slightly higher resolution than the Philips technology being used by currently available e-books (see Electronic Books below). However, the main advantage is the greater display area. One possible disadvantage, when compared with newer display modules, is the lack of flexibility of the Nemoptic device (it has thin glass sheets within it, whereas the Philips display uses a flexible metal foil).
AT July 07
Laptop Computers Without Hard Drives
There are persistent rumours circulating that Apple may, later this year, introduce notebook computers without hard disks. The replacement of the hard-disk drives by flash memory will have the advantages of removing lengthy start-up times and reducing weight. There is also the possible benefit of improved reliability and battery operating times.
The laptop computers would use the same type of flash memory as music players and digital cameras. Memory sizes have been increasing and component costs have been falling rapidly although, for large memory sizes, hard drives still have a small price advantage.
AT. June 07
The CRT Display Lives on, in Disguise
In December 2006, Toshiba announced that it was on track to mass-produce SED TV sets by 2008. Production will be in cooperation with Canon, which has been working on the technology since 1986. Small-scale output will commence in 2007, but it is not yet clear when the inevitable computer monitor derivatives will become available.
Each pixel of a SED (surface-conduction electron-emitter display) is an individual device which produces electrons to excite a surface phosphor coating, in a similar way to a CRT. Each device consists of a thin slit across which electrons tunnel when excited by moderate voltages (tens of volts). As the electrons cross the thin slit, some are accelerated toward the display surface by a high voltage gradient between the display panel and the surface-conduction electron emitter. The energy carried by the electrons is given up to the phosphor on impact to produce the pixel.
The claimed advantages for SEDs are:
Very high contrast ratios (50,000:1 to 100,000:1)
Exceptional 1 ms response time
180 degree viewing angle
Brightness of 450 nits (candela per square metre)
Municipal WiFi Progress
Singapore is the latest world-class city (actually, it is a nation with a population of 4.6m people) to commit itself to providing a wide coverage WiFi service for its citizens. The estimated set-up cost of the system is S$100m (£33.2m), with S$30m of this being provided by the Singapore government. The remaining S$70m will be supplied by three telecom operators, SingTel, iCell & Qmax. Internet access will be available in most public areas at speeds of up to 512kbps. This is rather slow broadband but it will be free for at least two years from the start date of September 2007.
The stated aims of the new development are to broaden the opportunities for all segments of the population to access and benefit from technology, together with the determination to create digital opportunities for all Singaporeans and never allow a digital divide in our society. These are worthy goals that London should take heed of: self-congratulation over the success of the People's Network is now beginning to look a little self-indulgent, as the UK digital divide reopens in the area of WiFi access.
To ensure that the poor of Singapore do actually benefit from the WiFi network, the government will offer 10,000 subsidised computers to low-income families with school-age children.
Of course, local authorities are not just concerned about the plight of the poor. They provide services for all and to do that they have to invest in a wide range of assets. Many of these assets are mobile, e.g. cars, vans, lorries & buses, and tracking them is a difficult problem. Nortel, a large equipment supplier to internet service providers, has announced what it considers to be the answer to the problem.
Nortel has integrated active RFID tags with GPS receivers. The tags communicate with a tag reader which then uses the municipal WiFi network to report the position of the asset. Such a system does not have to be limited to tracking assets which are inherently mobile. A normally fixed, high value asset which suddenly becomes mobile is probably being stolen and its whereabouts is of great interest to the police.
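The reporting chain described above (GPS-equipped tag, tag reader, municipal WiFi, tracking server) can be sketched very simply. The message format and field names below are invented for illustration; Nortel's actual protocol is not described in the article:

```python
import json

def position_report(tag_id: str, lat: float, lon: float) -> str:
    """Build the kind of JSON message a tag reader might forward over
    the municipal WiFi network to an asset-tracking server.
    (Hypothetical format: 'tag', 'lat' and 'lon' are assumed names.)"""
    return json.dumps({"tag": tag_id, "lat": lat, "lon": lon})

# A council van reporting its position (coordinates are examples):
msg = position_report("VAN-042", 1.3521, 103.8198)
print(msg)
```

A server receiving such reports could flag any "fixed" asset that starts producing changing coordinates, which is the theft-detection case mentioned above.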
More information: https://www.ida.gov.sg/home/index.aspx
AT, Dec 06
Changing Cost of Broadband Services
A report issued by Point Topic at the beginning of December showed that the worldwide cost of broadband delivered by optical fibre has fallen below that of cable.
The report is based on research carried out by Point Topic up to the end of September and found that the rate of decline in the cost of DSL and cable delivered broadband has slowed slightly, whereas the rate of decline for fibre delivered broadband has accelerated. The result of these changes is that cable is now the most expensive and fibre the second most expensive. The cost of DSL services, carried over old-fashioned telephone lines, is still significantly below that of the competing methods, but fibre is rapidly closing the gap.
Currently, the purchasing power parity* monthly rental costs are:
Optical fibre $28.1
*purchasing power parity is a method of adjusting currency exchange rates in order to equalise the purchasing power of the currencies.
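The footnote's adjustment can be illustrated with a small worked example. All the basket and rental figures below are invented for illustration; they are not taken from the Point Topic report:

```python
# Purchasing power parity (PPP): use the cost of the same basket of
# goods in two countries to derive an exchange rate that equalises
# purchasing power. Figures below are hypothetical.
basket_cost_usd = 100.0    # reference basket in the USA
basket_cost_yen = 12000.0  # same basket in Japan

ppp_rate = basket_cost_yen / basket_cost_usd  # 120 yen per PPP-dollar

# A 3,000 yen/month broadband rental, expressed in PPP dollars:
rental_yen = 3000.0
rental_ppp_usd = rental_yen / ppp_rate
print(rental_ppp_usd)  # 25.0
```

This is why PPP-adjusted rentals, like the $28.1 fibre figure above, can be compared across countries more fairly than rentals converted at market exchange rates.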
Coincidentally, or possibly not coincidentally, BT has announced that it has recently installed 2,300 km of optical fibre as part of its IP-based network transformation.
More information: https://www.point-topic.com/content/dslanalysis/061206benchmark.htm
The Financial Effects of Implementing RFID Technology
For some time, academic and public libraries in the developed world have been implementing rfid technology to manage their stocks (see RFID below). Last month, the public libraries of the city of Hamburg became the latest to start introducing it.
The advantages of rfid systems are obvious, but it has been rather difficult to obtain reliable, quantified information on how these advantages affect overall performance. The Dutch bookshop chain Boekhandels Groep Nederland (BGN) has now published the results of a trial that it has carried out in one of its shops.
BGN tagged the stock of its Almere outlet from April 06 and found that, in the following 6 months, its sales increased by 12%. This improvement is attributed to the ease with which customers were able to locate books.
The store has also cut back its inventory time for each box of books, from four minutes to a few seconds. Misplaced books are also found faster. If these cost reductions were replicated in all 42 BGN stores, handling 7 million books per year, it is estimated that $3.8 million per year would be saved.
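A back-of-envelope check of the BGN figures quoted above (the "a few seconds" per box is taken here as 10 seconds; that number is a guess, not from the report):

```python
# Estimated saving per book handled, from the chain-wide figures.
annual_saving_usd = 3_800_000
books_per_year = 7_000_000
saving_per_book = annual_saving_usd / books_per_year
print(f"${saving_per_book:.2f} per book handled")  # about $0.54

# Speed-up on box inventory: four minutes down to ~10 s (assumed).
old_seconds, new_seconds = 4 * 60, 10
speedup = old_seconds / new_seconds
print(speedup)  # roughly a 24-fold reduction in inventory time
```

Even at around half a dollar per book, the saving across 42 stores is substantial, which explains the roll-out decision reported below.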
The increased turnover and reduced costs have resulted in plans to introduce the technology in a further 16 BGN stores.
AT, Nov 06
More information: https://www.rfidlowdown.com/libraries/index.html
Internet Explorer 7
Microsoft Corp. released Internet Explorer 7 on 18th Oct. This is the first major upgrade to its Web browser since 2001 and is designed to counter the inroads that Mozilla's Firefox product has been making in a market segment dominated by Microsoft (see Browser Wars & Mozilla Gaining Ground below).
Firefox features such as an integrated search window (allowing users to carry out a Web query without opening another page), tabbed browsing (allowing toggling between different sites) and a pop-up window blocker have now been incorporated in IE7. Unsurprisingly, IE7 uses Microsoft's Windows Live as the default search engine, whereas Firefox uses Google as the default.
One of the advantages claimed for IE7 is the ability to restore work after a browser/PC crash.
Microsoft now has a product which can compete very well with the current Firefox browser. However, Mozilla plans to release an upgraded browser, Firefox 2, within the next few weeks.
A recent survey by OneStat.com has found that Internet Explorer has an 86% global share and Mozilla Firefox 11.5%. Within the very important US browser market, 80.77% of users surf the Web on IE and 14.88% on Firefox. Firefox is most popular in Germany (33.4%), Australia (25.5%) and Italy (21.6%).
IE7 is available immediately to Windows XP users and it will eventually serve as the default browser for Microsoft's much-heralded Windows Vista operating system, due to be released to consumers in early 2007.
IE7 can be downloaded from www.microsoft.com/ie.
Browser market survey is available from www.onestat.com
AT, Oct 06
European Digital Library
The European Commission has called on Member States to contribute to the European digital library by setting up large-scale digitisation facilities to accelerate the process of getting Europe's cultural heritage on line via the European Digital Library.
At present, only a small fraction of the cultural collections in the Member States is digitised but, by 2008, it is planned that two million books, films, photographs, manuscripts and other cultural works will be accessible through the Digital Library. It is expected that this figure will grow to at least six million by 2010 and will ultimately be much higher, as potentially every library, archive and museum in Europe will be able to link its digital content to the new library.
Obviously, the input from the UK's public library services will be limited. However, the academic libraries have an enormous amount of suitable material and a good start has been made in digitising this. It is to be hoped that the Google/Microsoft initiatives in this area will be complementary rather than competitive (see: Open Content Alliance & Google Flexes Muscles).
The European Library portal is at: https://www.theeuropeanlibrary.org/portal/index.htm
Further information: https://europa.eu/rapid/pressReleasesAction.do?reference=IP/06/1124&format;=HTML&aged;=0&language;=EN&guiLanguage;=fr
AT Sep. 2006
The Bloomberg New York City administration is slowly becoming more receptive to the concept of providing WiFi access for the public. Unlike other large US cities (e.g. San Francisco, Pittsburgh and Sacramento) where free access is becoming the norm, NY City has been reluctant to do more than pontificate about the supremacy of private enterprise in the USA and it has been left to the students of Monroe College and local community groups to set up hot spots in the city.
Now, the city no longer expects to make money from any WiFi ventures in its jurisdiction and, by the end of the summer, ten of its parks will have WiFi access provided (eight locations within Central Park). In addition, squares in Manhattan, the Bronx and Queens will be covered.
A Senator representing New York State, Chuck Schumer, has stated that he will introduce a bill to provide US federal funds to communities wishing to install WiFi systems citywide or countywide.
Reuters reports that the mayor of Paris is planning to install 400 WiFi access points in the city's public areas, such as parks and libraries. Some free access is expected when the system goes live next year.
Pity poor London.
The semiconductor arm of Philips, the multinational electronics group, has announced the launch of its next generation of rfid (radio frequency identification) chips. The company, the world's largest producer of rfid chips, is targeting the library market for its new product via collaborating hardware manufacturers and system integrators.
Rfid technology has been around for some time but has only made a small impact on the library world (there are only a few hundred libraries using it, worldwide). Probably, the reason for this is the fairly high cost. However, costs are falling as the retail market increasingly adopts the technology. The retail market segment will need many millions of items per year and economies of scale will result. The standardization of library system technology that is both ISO 15693 and ISO 18000-3 compliant will also help rfid take-up in libraries.
North American libraries are leading the adoption of rfid technology, but the Netherlands has recently decided that its libraries would also utilise it. UK libraries seem to be taking a wait and see stance.
For hard-pressed UK local authorities, a reduction in the cost of book issuing is potentially the most attractive aspect of rfid technology. A borrower does not have to struggle with a bar code reader to take out/return a book via a self-service terminal. So these terminals become much more user-friendly and can be deployed on a wider scale. Indeed, the 2 metre range of the new devices suggests that dedicated terminals may not be necessary at all. Grouped library catalogue computers could possibly be used, or even People's Network computers.
One of the most frustrating aspects of trying to use a public library as a reference source is the mismatch between the library catalogue and the actual stock on the library's shelves. Although staff do periodically check that books which have not been issued for a given duration are actually on the shelves, this is only a partial remedy and it is quite time consuming (costly). With rfid, proper stock-taking can be quickly accomplished and, in theory, could be undertaken each day to ensure efficient stock management, at almost zero cost. The remote searching of on-line catalogues would thus become far more useful.
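The daily stock-take described above amounts to a simple set comparison between what the catalogue says should be on the shelves and what an rfid scan actually finds. A minimal sketch, with invented item IDs:

```python
# Catalogue records for items not currently on loan (hypothetical IDs):
catalogue = {"B1001", "B1002", "B1003", "B1004"}
# Tag IDs actually read during a shelf scan:
scanned = {"B1001", "B1003", "B1004", "B2000"}

missing = catalogue - scanned      # in catalogue, not found on a shelf
unexpected = scanned - catalogue   # on a shelf, not in the catalogue

print(sorted(missing))     # ['B1002']
print(sorted(unexpected))  # ['B2000']
```

Because the scan itself takes seconds rather than hours, this reconciliation could indeed be run daily, keeping the on-line catalogue trustworthy for remote searchers.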
AT, June 06
The alpha 3 release of Mozilla's Firefox 2 browser is expected to become available on 25th May. As an alpha release, it is still in the development phase and is still some distance away from a final release.
The current full release of Firefox (version 1.5) has been nibbling away at the market dominance of version 6 of Microsoft's Internet Explorer (see Mozilla Gaining Ground below) and Microsoft has been working hard to bring out an upgrade to counter this (IE7 is now at beta 2 release). Both IE7 and Firefox 2 are believed to have final release dates at the end of the year.
The more seamless operation with the Windows operating system that Microsoft can offer with its browser product is a huge advantage, so Mozilla has to offer other advantages to offset this. Microsoft's position has been further strengthened in this respect by Apple computers now being able to run the Windows system.
AT May 06
Nicholas Negroponte, a professor at MIT, has been developing the concept of low cost computers for third world countries for some years and reported project progress at the recent LinuxWorld event.
The Negroponte-led One Laptop Per Child (OLPC) initiative has obtained $29 million in funding for engineering. It is expected that the project launch will be in 2007, with shipments of between 5 million and 10 million units initially in China, India, Thailand, Egypt, Nigeria, Brazil and Argentina.
The key to achieving the target $100 price for the laptop is the huge volumes needed: up to 100 million units per year. Initially, the computer will cost $135, although it is expected that this will fall to hit the $100 target by 2008. The cost may even drop as low as $50 in 2010.
Negroponte has pointed out that 50 percent of the cost of a normal notebook is attributed to sales, marketing and distribution costs. OLPC claims that it has no such costs. In addition, OLPC expects to avoid the 25 percent of total cost contributed by a Windows operating system license by using Linux instead. The remaining 25 percent of the total normal cost is that of the display. OLPC will reduce this by utilizing a dual-mode display that is both reflective and transmissive i.e. reducing the backlighting requirement.
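Negroponte's breakdown can be checked with simple arithmetic. The $540 starting price below is illustrative (chosen so the result lands on the $135 figure quoted above), not a figure from the article:

```python
# Cost shares of a conventional notebook, per Negroponte's breakdown:
# ~50% sales/marketing/distribution, ~25% OS licence, ~25% display.
conventional_price = 540.0  # illustrative, hypothetical price

sales_share = 0.50   # removed entirely by OLPC's distribution model
os_share = 0.25      # avoided by shipping Linux instead of Windows

remaining = conventional_price * (1 - sales_share - os_share)
print(remaining)  # 135.0 - only the display cost remains
```

Squeezing the remaining display cost, via the dual-mode reflective/transmissive screen, is then the route from $135 toward the $100 target.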
The $100 laptop computer will have a 500MHz AMD x86 processor, 128MB of DRAM & 512MB of Flash memory. It will have a power drain of less than 2 watts, which can be supplied by wind-up power, similar to the successful radio designed for the third world market. Wi-Fi connectivity will be provided and there will be three or four USB ports.
Although the $100 laptop does not appear to be very attractive for the more developed markets such as the UK, the enormous component volumes that will be required for its manufacture will have a great influence on the design of computers intended for those markets and will, inevitably, result in further price erosion there.
There has been much high volume comment from, and inspired by, large US telecoms companies about the supposed unsustainability of the business model used by schemes offering free wireless broadband internet access to city communities. Whilst these incumbent service providers heavily criticise the pioneering municipal WiFi/WiMax schemes, they are quietly beginning to follow the same strategy themselves. Sprint is now setting up two free wireless internet access zones in a suburb of Las Vegas to test the idea before rolling it out. Verizon and Time Warner are also companies that have had an unpublicised change of heart. Of course, this about-turn has nothing to do with the fact that large, non-telecoms companies (such as IBM & Google) are showing intense interest in the sector, or does it?
AT, Mar 06
Six bids for the San Francisco citywide WiFi system were received by the city on 21st February. As predicted, Google was one of the bidders (see Google Flexes Muscles ). The search engine company has teamed up with EarthLink, the well-known ISP, to propose a two level system. Google would offer a free, slow service (256 to 384 Kbps), whilst the Earthlink service would be fee based but faster (1Mbps in both directions). The companies would share the cost of deployment and operation.
The powerful SF Metro Connect alliance (IBM and Cisco Systems, together with SeaKay), has also submitted a bid for the San Francisco contract. It is believed that this also has free access provision. Other proposals have been submitted by Communication Bridge Global, MetroFi, NextWLAN and Razortooth Communications.
San Francisco is not the first US city to launch a citywide WiFi project. Philadelphia, Anaheim (both EarthLink projects) and some other major cities have also done so and it is expected that about 25 major US cities will have been unwired in the next two years. Although these are largish cities, they are generally far smaller than London (San Francisco has a population of less than 1 million people).
New York City is closer in size to London, with a population of just over 8 million people, and has an enthusiastic group of Councillors advocating the construction of a citywide WiFi system. However, the free-market Republican mayor, Michael Bloomberg, believes that a private sector group should set up a system, if one is needed. He has, therefore, been placing obstacles in the way of efforts to implement a community-run operation. Discussions have been going on for over three years and the only progress has been the introduction of a City Council Bill to set up a task force to investigate the matter.
Bloomberg's hope of a private sector WiFi solution for NY City has not produced any firm response from companies. EarthLink has stated that it is theoretically interested, but would prefer to have the city as a partner.
Not all influential people in the USA agree with the stance of the NY Mayor. Two Bills were recently introduced into the US Senate to allocate more of the available RF spectrum to community use.
Whilst the NY citywide WiFi scheme may only be crawling slowly forward, a London scheme has yet to even get to the public discussion stage. There is a great danger that the enormous opportunity presented by the 2012 Olympic Games for setting up a London-wide WiFi/WiMax system will be lost. The advertising possibilities open to any organisation, such as Google, willing to put such a system into London by 2012 would more than compensate for the large initial cost. Thus, London is in a good position to get a very good deal, but only if it moves in time.
America Online and Yahoo have announced that they intend to introduce a preferential email service. They intend to offer privileged treatment to companies which are prepared to pay for the delivery of email messages.
The internet companies allege that the preferential service will help them identify legitimate mail and cut down on junk mail. Presumably, all unpaid emails will be defined as junk mail. This perception is reinforced by the intention of AOL to allow preferential emails to bypass its spam filters and to delete links & images from most unpaid messages.
The new proposals seem to offer very little advantage to bulk email senders (the prime target) over the far cheaper, fixed fee services already provided by other companies.
Of course, one suspects that the new proposals are all about the huge additional revenue AOL and Yahoo will generate if they succeed in their ploy.
The principle of net neutrality states that network owners should remain neutral with respect to the content they carry. There has not been any outright dissent from this principle. However, AOL and Yahoo seem determined to ignore it. Additionally, executives at three of the largest US network operators (BellSouth, AT&T and Verizon Communications) have also suggested that large content providers should pay extra for priority use of their networks. The idea that service quality levels should only be guaranteed at a price has caused the US Consumers Union, the Consumer Federation of America and Free Press to call on Congress to enact net neutrality legislation. The US Senate has started taking evidence and a Google executive has argued forcefully against the suggested changes to current practice.
If the companies proposing the changes are successful, it is to be expected that other internet companies will then follow suit and the free services currently available will become progressively worse until they are finally discontinued. There is no doubt that a great blow will have been delivered to the vision of the open world wide web which was generally accepted such a short time ago.
This year will see the beginning of the end for Microsoft's Windows XP operating system. On the last day of 2006, Microsoft will stop supporting WinXP Home - or so it says. Support for WinXP Pro, the more expensive version, will continue until 2011.
There is some doubt about whether Microsoft will actually carry out its published programme, as it has failed to live up to past, similar declarations concerning Win98. After several stays of execution, Microsoft now intends to stop issuing new patches, updates and fixes for Win98 on 30/06/06 and to remove the self-help pages from its website one year later. It probably means what it says about support for Win98 this time.
After a life of eight or nine years, it is reasonable to retire Win98. However, the original retirement dates were obviously determined by Microsoft's income requirements rather than any great market need.
The retirement dates of WinXP are again being driven by cash generation considerations: the release of Windows Vista (formerly Longhorn) is expected to be announced in the autumn, with availability aimed at Christmas 2006. The major specification improvements promised for the new version are greatly improved security and, for gamers, 64 bit operation.
Providing that the applications running on an operating system are performing well, there is no great urgency to upgrade. Most current application software still supports Win98 and, of course, all of it that is aimed at the Windows system (rather than the Mac or Linux systems) supports WinXP. However, it is not absolutely necessary ever to upgrade Windows; one could simply change to Linux, at no cost. Free versions of Linux, such as Mandrake and CentOS (Community ENTerprise Operating System), can be downloaded. These are based on the Red Hat Linux source code - quite legally, as Red Hat is released under the standard GNU General Public Licence - and they are regularly upgraded, just like other operating system software. If the Linux operating system is coupled with OpenOffice, Mozilla's Firefox browser and Grisoft's AVG antivirus package, 95% of the software for a good home PC can be obtained free. There may be a very small cost for a business PC, but it would be insignificant compared with continuing to use Windows.
A few days after the above article was posted on the LLL website, Microsoft removed the 31 Dec 2006 date from the XP support timelines. The Microsoft website now states that mainstream support will end two years after the next version of Windows is released. This should now push support for WinXP out to the end of 2008 / beginning of 2009, as a minimum.
A report from the European Interactive Advertising Association, issued at the end of 2005, compared the popularity of the different types of media on offer in Europe.
The report found that the European usage of online media grew by 17% in 2005 compared with TV at +6%, radio at +14%, newspapers at +13% and magazines at -7%.
France was found to lead the league table of internet usage with the average consumer spending 13 hours per week online. The UK and Spain were placed second in the table with the average consumer spending 11 hours per week online. Italians were the least enthusiastic internet users with an average of 8 hours per week online.
One surprising finding in the report was that 10% of Europeans already use VoIP (Voice over Internet Protocol) for their telephone calls. This rapid take-up of the technology must be causing concern at BT and the other European wired & wireless telecom companies.
More information: https://www.eiaa.net
AT, Jan 06
Microsoft has joined the Open Content Alliance (OCA) and Yahoo, to become a rival to Google (see Google Flexes Muscles) in the scramble to digitise the world's libraries. As a result of the collaboration, Microsoft will launch MSN Book Search next year. Other contributors to the OCA initiative include Adobe Systems, Harvard University, The Royal Botanical Gardens at Kew, the Smithsonian Institution, the UK National Archives and York University.
MSN Book Search will also deliver results from academic libraries, periodicals and other print sources, putting the service in direct competition with Google.
One of the first significant outcomes from the Microsoft initiative is an agreement with the British Library to digitise its out-of-copyright collection and to provide search facilities for it. During 2006, it is expected that 25 Million pages (100,000 books) will become available on the internet from the British Library. This is only a small part of the British Library collection, but the project is expected to continue beyond 2006.
Of course, the Gutenberg project (see Books on Line ) has been making available on the internet out-of-copyright books for many years and public spirited authors have been increasingly bypassing the publishing houses by simply uploading their books onto websites. The new developments bring to bear the enormous resources of Google & Microsoft and this should ensure that progress will very rapidly accelerate.
AT Dec 05
OpenOffice 2.0 Has Arrived - Free!
OpenOffice 2.0 final was released at the end of October. By default, it uses a vendor-independent format to save documents - utilising the open standard OASIS OpenDocument XML format. This file format is recommended by the European Commission.
As a direct competitor to Microsoft's Office suite, OpenOffice has the overwhelming advantage of being FREE to download. In common with Microsoft's offering, the OpenOffice package is large (75MB). Downloading via a broadband connection is therefore a practical proposition, but doing so via a dialup connection would probably be a little difficult. Very cheap CDs of OpenOffice are often available and provide an alternative to downloading.
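The download-time arithmetic behind that judgement is simple. The 2 Mbps broadband speed below is an assumption for illustration:

```python
# 75 MB package size, expressed in bits (decimal megabytes assumed).
size_bits = 75 * 8 * 1_000_000

dialup_bps = 56_000        # 56 kbps dial-up modem
broadband_bps = 2_000_000  # 2 Mbps broadband connection (assumed)

dialup_hours = size_bits / dialup_bps / 3600
broadband_minutes = size_bits / broadband_bps / 60

print(round(dialup_hours, 1))      # roughly 3 hours on dial-up
print(round(broadband_minutes))    # about 5 minutes on broadband
```

A three-hour tie-up of the phone line explains why the cheap CD route remains attractive for dial-up users.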
Microsoft has a near monopoly position in general office software used in library services and this is an unhealthy situation. Should not OpenOffice be seriously considered at the next upgrade?
Choosing Microsoft is the easy option: it does not require any thought. However, this used to be the situation with IBM computers and now IBM is just one of the many players in the market place. Why should the situation be different for software?
Google is stirring up hostility from the publishing world by extending its search technology to the world of books. Its new Google Print venture is intended to do for the print medium what it has done for the internet i.e. to point the user to the source of information in print. This idea was originally received by academic librarians with some enthusiasm (Oxford, Stanford, Harvard & Michigan universities have agreed to open their collections to Google), as the resources that Google can employ are enormous and this would obviously be, potentially, a great help in research tasks. The New York Public Library has also opened its collection to Google.
Books in English were the first Google Print target and it has already run into opposition from the (hidebound?) book establishment. The US Guild of Authors and three authors are suing Google for breach of copyright. Google believes that it has no case to answer, as what it is doing comes within the fair use definition, and it is pressing ahead with its programme by expanding beyond the US/UK sources. Oxford University and the N.Y. Public Library have avoided the wrath of US authors by limiting Google's access to documents that are out of copyright.
It is difficult to see how Googles activities can be other than an advantage to the vast majority of genuine authors i.e. authors who have created original material. However, there are a small number of reference book compilers who may be able to show that scanning their works is a breach of copyright.
By embarking on a legal fight with Google over the interpretation of copyright law, authors could be opening Pandora's box. Copyright laws are very much more generous to authors than patent laws are to inventors and there is a school of thought that suggests there should be a relaxation of the copyright rules.
A high profile copyright legal fight may well open up the discussion. One of the leading critics of copyright law, Lawrence Lessig, a professor at the Stanford Law School, has suggested that one cannot accept the activities of the various internet search engines as being legal without accepting that similar scanning is also legal for print media.
Alternatively, the converse argument may be true. If the scanning of books is illegal for the purposes of providing source information, the traditional activities of internet search engines are probably also illegal. It looks as though the action by the US authors strikes at the very roots of the internet.
Having upset the US Guild of Authors, Google then went on to join up with Sun Microsystems to create a software business that could prove a great threat to the worldwide domination of Microsoft. Microsoft's Office suite market has been suggested as a likely target for the new group. It is some time since Microsoft has had a strong competitor, so it too is likely to be a little miffed.
Google has proposed setting up a free, citywide WiFi system in San Francisco. This will upset yet another set of business people. Google will use its usual advertising method of funding for the new venture. Could this be the way forward for London?
Londoners are to be offered an ADSL2+ internet access service in October. This type of service will provide broadband access speeds up to 18Mb/s+. The new offering will not come from BT, which has been testing a similar scheme, but from Be and UK Online. The cost of the new service is expected to be between £20 and £30 per month.
BT is concentrating on rolling out its 8Mb/s service across the country and did not expect a full UK roll-out of ADSL2+ for some time.
Another new service for Londoners, which started operation this month, is a broadband wireless one from UK Broadband. The speed range of this service is 256kb/s (cost £10/month) to 1Mb/s (cost £18/month).
The IFA tradeshow in Berlin during the first week of September was the showcase for two new developments which could prove to be significant milestones on the road to practical, mass-market electronic books.
The first development is an 8GB 1 inch hard drive which provides a storage density of 105 GB per square inch. This Hitachi device is also claimed to consume 40% less power than its predecessor and to be highly shock resistant.
The 1 inch Hitachi drive is 5 mm in height and 13 grams in weight. So, it is very suitable for electronic products intended to be carried in pockets. The low power requirement and good shock resistance are also considerable advantages for this type of application.
The next size up from the 1 inch drive is the 1.8 inch format. The increase in diameter (the height is still 5mm) has enabled Hitachi to make a 30GB drive available now and to promise a 60GB drive in early 2006.
These new drives would provide sufficient storage capacity for a small to medium library of books, provided that the books contain only a limited number of photographs.
The second development is a flexible display called the Readius. This has been produced by Polymer Vision, a spin-out from Philips in Eindhoven.
When not in use, the screen is rolled up and the overall size of the product shrinks to 100mm x 60mm x 20mm, approximately 1/3rd of the size of an average paperback book. However, when required for use, the device unwinds to provide a 4.8 inch (122mm) paper-like display.
The display is monochrome and uses the bistable E-Ink technology. The lack of colour is not really a great drawback for text, but the bistable nature of the display is a great advantage, as it would allow small batteries to be used. This is the same technology that is used in Sony's e-book reader, LIBRIe, which was launched approximately 18 months ago (see EBOOKS).
Mass availability of electronic book readers based on these and similar developments is still a few years away, but it is only a few years. When people have the capability of carrying around hundreds of books in their pockets, what will be the function of public libraries?
AT Sep 05
The first report, TCO for Application Servers: Comparing Linux with Windows and Solaris, states that the TCO of a Linux application server over three years is $40,149 compared with $67,559 for Windows and $86,478 for Solaris. These totals include costs associated with transferability of administrator skill and hardware architecture portability, in addition to the immediate installation costs.
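Expressed as percentage savings, the report's three-year totals can be checked with a few lines of arithmetic (a quick illustration using only the dollar figures quoted above):

```python
# Three-year TCO totals quoted in the report (US dollars).
tco = {"Linux": 40_149, "Windows": 67_559, "Solaris": 86_478}

for rival in ("Windows", "Solaris"):
    saving = 1 - tco["Linux"] / tco[rival]
    print(f"Linux saves {saving:.0%} of the three-year TCO versus {rival}")
```

On the report's own numbers, then, Linux comes in at roughly 41% below Windows and 54% below Solaris over the three-year period.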
Other benefits include the flexible licensing model of Linux and its compatibility with a broad range of hardware platforms.
The second report, Beyond TCO: The Unanticipated Second Stage Benefits Of Linux, considers the skill-based benefits of using Linux. IBM suggests that Linux is very popular with IT staff and especially with young IT staff. The reason for this is the great support for Linux in universities and colleges. Thus, graduates leave university with a good knowledge of Linux and recruiting them tends to be easier for organisations employing the system. Their managers also find the flexibility of the open source model an advantage.
Further information: https://www-1.ibm.com/linux/competitive/solarisToLinux.shtml
LAPTOPS ARE FOR HOLIDAYS AS WELL
Tobacco, alcohol, drugs and now emailing have been found to be addictive. At least that appears to be the conclusion to be drawn from two recent US surveys.
An Intel sponsored survey carried out by Harris Interactive has found that 51% of respondents took their laptop computers on holiday with them. The vast majority of these people used their computers to send/receive personal emails and a rather sad 43% sent/received work related emails.
A second survey by America Online has found that 60% of US holidaymakers use their laptops to check their emails (13% for business purposes and 47% for personal reasons).
Running games, films and music were also uses found by America Online to be popular holiday time uses for laptops (56% of respondents).
Only 45% of the AOL respondents used their laptops to obtain travel information whilst travelling and 30% used them to edit holiday photographs.
The fact that a large proportion of US citizens refuse to be parted from their laptops because they cannot bear to be away from their email in-trays has interesting implications for those public libraries in central London that are considering setting up WiFi zones.
AT, July 05
Public library internet provision is sometimes criticized as being predominantly used for email. Whilst there is nothing wrong with this use, a recent survey suggests that the popular view may become outdated, if it is not already. The Pew Internet & American Life Project, 2005 has found that adult internet activities can be broken down as follows:
Research into products and services 78%
Read news items 72%
Look up DIY information 55%
Bid in online auctions 24%
Visit chat rooms 17%
Of course, this is an American survey and will not translate exactly to the UK. However, the UK usually follows where the US leads, so the survey can be used as an indication of the near future in this country. As the three most popular internet activities are surely also those which would be at the top of the list in any public reference library in the UK, the survey results should not be a great surprise. Perhaps there is something for the supporters of public library reference collections to worry about though.
The search for DIY information was found to go well beyond the expected home improvement and gardening themes. The advantage of the internet as a very broad source of information is clearly an important factor here.
More information: https://www.pewinternet.org/
AT, June 05
There are two developments slowly creeping up on internet users. They have both been around, in theory, for some time and are now becoming practical propositions. They each, of course, have the obligatory acronym without which it would be impossible for the internet community to take them seriously - IPv6 and ADSL2+.
Internet Protocol version 4, IPv4, has been in existence for a very, very long time. It provides the method of defining a unique address for the huge number of internet users that exist worldwide and now is being superseded by IPv6. The main reason for the change is that there are simply too many hosts on the internet and IPv6 provides an increase from 4 to 16 bytes (32 to 128 bits) for use in the address.
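The jump from 32 to 128 bits is hard to grasp in words; a few lines of arithmetic (a simple illustration, not taken from any standards document) make the scale clear:

```python
# IPv4 addresses are 32 bits wide; IPv6 addresses are 128 bits wide.
ipv4_space = 2 ** 32
ipv6_space = 2 ** 128

print(f"IPv4: {ipv4_space:,} possible addresses")    # about 4.3 billion
print(f"IPv6: {ipv6_space:.2e} possible addresses")  # about 3.4 x 10^38

# Every single IPv4 address could be replaced by 2^96 IPv6 addresses.
expansion = ipv6_space // ipv4_space
print(f"Expansion factor: 2^{expansion.bit_length() - 1}")
```

In other words, the new address space is not 4 times larger but 2^96 times larger, which is why exhaustion is no longer a practical concern.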
The migration of dial-up internet users to broadband internet use has resulted in these users changing from temporary holders of IP address numbers (automatically allocated by their internet service provider) to permanent holders of an IP address. The efficiency with which the available addresses are used has, therefore, declined at the same time as the overall use of the internet is increasing. Without the changeover to IPv6, this would be a recipe for disaster.
It may seem rather strange that the Internet Protocol version following version 4 should be version 6. Just as telephone system planners did not realise that they would eventually run out of telephone numbers, the internet planners were slow to understand that the same could happen on the internet. Thus, the IPv5 designation was allocated to a parallel development called the Internet Streaming Protocol before the potential problem was recognised.
With the introduction of IPv6, the internet user will not normally notice any difference in operation when he/she goes online. At least, that is the hope/expectation. However, there have been some problems reported. For instance, Microsoft's Internet Explorer has been found to be unable to resolve domain names (the www.xxxx name) to IPv6 addresses for some servers.
Whilst the objective of the IPv6 change is simply to keep the internet operational, the objective of ADSL2+ is to improve performance. Most people who have a broadband internet connection actually utilize ADSL (asymmetric digital subscriber line) technology, which allows a download rate of 1 or 2 Mbit/sec and an upload rate of a few hundred kbit/sec to be achieved, a significant improvement on the speeds available with a dial-up connection. That this transmission rate is fairly reliably accomplished on a basic POTS (plain old telephone system) copper network of the order of 100 years old (in London) is astounding. That there should be plans to accelerate the transmission rates by a factor of 10 to 20 seems unbelievable. However, the International Telecommunication Union (the international body overseeing telecommunications standards) approved the ADSL2+ standard in Jan 2003 to do just that.
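The practical effect of that factor of 10 to 20 is easiest to see as download times. The sketch below assumes a 2 Mbit/s ADSL line, an illustrative 20 Mbit/s ADSL2+ line and a 700MB file (a full CD image); none of these are guaranteed real-world rates:

```python
def download_minutes(size_mb: float, rate_mbit_s: float) -> float:
    """Minutes needed to transfer size_mb megabytes at rate_mbit_s megabits/sec."""
    return size_mb * 8 / rate_mbit_s / 60

cd_image_mb = 700  # a full CD image, for illustration

print(f"ADSL   (2 Mbit/s):  {download_minutes(cd_image_mb, 2):.0f} minutes")
print(f"ADSL2+ (20 Mbit/s): {download_minutes(cd_image_mb, 20):.0f} minutes")
```

A transfer that ties up a basic ADSL line for the best part of an hour completes in around five minutes at ADSL2+ rates, which is what makes services such as video delivery plausible.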
The infrastructure upgrade necessary to meet the ADSL2+ standard will, no doubt, be something that BT will carefully control. However, ADSL2+ has been designed to be backward compatible with existing ADSL equipment, so there should be a seamless changeover to the new system, as far as the user is concerned. London, with its large population concentrated within 4 to 5 km of local exchanges and mostly within 2km, will probably be the first large scale deployment of ADSL2+ in the UK, although this is likely to be preceded by several small pilot trials. The 2km distance is important, as that is the present limit for achieving the maximum data rate transmission.
BT has only recently completed upgrading its telephone exchanges to make available standard ADSL service to virtually the entire UK population. So why should it start immediately on another round of upgrades? The answer is that it makes good commercial sense. With the higher transmission rates of ADSL2+, it is possible to simultaneously carry high definition TV (HDTV) broadcasts, internet data and internet phone calls over that POTS network that BT owns. The internet data transmission business has, over the past few years, become an important profit earner for BT and it expects that HDTV & VoIP (voice over internet protocol, i.e. internet telephone calls) will become just as profitable.
So, when ADSL2+ has been fully rolled out, we will be able to look forward to a future of stable internet technology, where everyone will know what is on offer and everyone will be able to get it. Well no, not quite. There are already plans to increase transmission rates beyond those of ADSL2+. Look forward to the VDSL standard, where rates increase to 30 to 100 Mbits/sec.
AT, June 05
Sony, the inventor of the 3.5 inch floppy disc, has announced that it is to stop making this product. This workhorse version of removable storage has fallen victim to the improving reliability and decreasing cost of CDs.
The floppy disc cannot compete with the cost per MB of a CD and, with the high demand for image storage devices, it is unable to deliver the required storage capacity. It is useless as a system backup storage device. Nevertheless, it has given excellent service and is far more rugged than its predecessors, the 5.25 and 8 inch diskettes.
There are, of course, other manufacturers of 3.5 inch floppies. However, Sony has declared the death of the medium and it will not be long before it is buried. We all need to plan for that event by transferring must-keep data to CDs. The next PC purchased is unlikely to have a 3.5 inch floppy disc drive; some corner-cutting versions don't have them now.
In the recent past, the reliability of CDs has caused concern with data becoming irretrievable after only a few months of storage (see CD Problems & CD Problems Revisited, below). However, it is now clear that, provided the user is careful with his / her labelling (the adhesive on some labels and the ink in some felt-tip pens attacked the CD) and does not live in the Amazon rainforest (fungal growth was an occasional problem), CDs will provide a useful data storage life. The length of that life depends on the type of CD used.
If long storage life is the criterion for choosing a CD, the CD-R type is obviously preferred over the CD-RW type (why pay more for the ability to overwrite data?).
The cheapest CD-R discs are constructed by sandwiching a cyanine dye between a transparent polycarbonate main body and a reflective aluminium foil layer. The writing laser burns pits into the dye through the polycarbonate plastic to record the ones and zeros of the data. These pits can then be read by a low power laser which does not alter them. With this construction (recognisable by its bluish or blue/green tinge), it is claimed that a data storage life of 10 to 70 years is achievable.
The longest life for CD data storage is claimed for CDs which use a phthalocyanine dye with a gold reflective foil. The foil, seen through the transparent dye, gives the disc a golden colour and this also reflects the high cost. This type of CD has a claimed data storage life approaching 100 years.
Between the two extremes, there are two other types which give a better life than the cyanine / aluminium type but not as good as the phthalocyanine / gold type. These are formazan dye, gold foil type (light green colour) and metallized AZO /silver foil type (dark blue colour).
Remember to keep the CDs vertical and protected from dust.
AT May 05
One aspect of the London 2012 Olympics bid that has not been widely aired is the communications legacy that would be left behind. To properly accommodate the needs of the visiting athletes, hacks and public, it will almost certainly be necessary to upgrade the landline, cell phone and WiFi infrastructure throughout London, but especially in east & central London. Not to do so would put in jeopardy the operation of London's businesses.
The world-wide popularity of cell phones suggests that this is the area which will need greatest work. However, by 2012, WiFi use will have expanded far beyond its present rather limited employment. Most new laptop computers have a WiFi capability and people are becoming more and more familiar with its use. Taking into consideration the fact that most of the visitors coming to expensive London will have to be fairly affluent, the demand for WiFi services can be expected to be great. Provision will have to be improved and there is a place for public libraries in the plan. Hopefully, the outrageously expensive British Library scheme (£4.50/hour for normal use and 50p/min for help) would not be considered to be the model.
AT May 05
The internet did not loom very large in the recent UK general election. However, analysis of the US Presidential election in 2004 now indicates that it could become another potent source of political comment. The BuzzMetrics and Pew Internet & American Life Project report "Buzz, Blogs and Beyond: The Internet and National Discourse in the Fall of 2004" compared political blog activity with that of other media in the last stages of the US Presidential election. Influence and buzz generation (buzz generation = amount of simultaneous talk) were also compared.
The study considered 40 sites (16 conservative blogs, 16 liberal blogs and 8 general blogs). Conservative bloggers made 6,716 unique posts, the liberal bloggers wrote 7,151 entries, and neutral blogs had 4,251 unique posts between 27 September and 31 October 2004.
The researchers allocated chatter on political message boards and forums to one of the conservative/liberal/general categories and found that there were 984,549 unique conservative posts, 947,503 liberal posts and 98,963 neutral posts in the period.
Campaign releases and official blogs generated by the two political parties were monitored and it was found that there were 955 Republican messages and 835 Democrat messages.
The US public does seem to have taken the opportunity to make its voice heard above the noise from the political parties and may well pose a future threat to the power of the media barons. If the US experience can be replicated in this country, democracy will be strengthened. The opportunity for the public to make its views heard directly, rather than via a media/government filter should act as a counter to apathy.
The danger is that the party political machines may try to hijack the new outlet and thus damage its credibility. The Labour party has been found to have used party activists to append their names to template and other biased letters to local newspapers. These supported party policy without acknowledging the party affiliation. It would be just as easy for the political parties to use this underhand technique on the internet. However, the technology exists to simply and quickly search for common phrases or other groups of words over a wide range of websites. The template nature of such internet messages would then become obvious, as would the original source. Not only would the practice quickly become public knowledge, but it would become self-defeating.
More information: https://www.pewinternet.org/
AT May 05
Google is rolling out a new feature for its widely used toolbar. It calls this addition Google Compute and it is the visible part of an initiative to aid the scientific community.
The initiative is to implement in the widest possible way distributed computing, a subject that has already been discussed here (see Grid Computing & Grid Computing Revisited). Google is effectively inviting users of its toolbar to participate in scientific computing projects simply by clicking an icon. It is offering two methods of participation, Standard and Conservative. The Standard mode allows Google Compute to run in the background whenever the computer is fully powered up, i.e. not in stand-by mode. The Conservative mode allows Google Compute to run only when the computer user is not actively using the computer. In addition, it is possible for the computer user to specify the amount of processor idle time which is allocated to Google Compute.
At present, the scientific project using the public's computers is the nonprofit Folding@home project at Stanford University. This is attempting to model the geometric structure of proteins, a fundamental property of life. Although Google is an American company, one would hope that it would not restrict access to Google Compute to US researchers. If the computers being used are sited internationally, then the benefits should be shared internationally.
Google has attempted to address privacy issues with its Google Compute offering. However, as with its other products, there is the concern that it is a US company, subject to the laws and law enforcement agencies of that country.
AT May 05
Wikipedia, which receives charitable funding from Google and Yahoo, is rapidly gaining popularity, according to the online intelligence service Hitwise. The data collected by Hitwise from ISPs and mega panels suggests that in March 2005 Wikipedia was the 33rd most popular site for visits from search engines. The site has been consistently improving its ranking during the second half of 2004 and throughout 2005.
Wikipedia is usually simply described as a free on-line encyclopaedia, but it is also popular as a source of news items. Although news items are more ephemeral than the traditional contents of an encyclopaedia, it is advantageous to have them grouped in one place. The background to the news items can then be looked-up very easily.
The open source Wikipedia is now proving to be more popular than the commercial general reference websites such as Answers.com and Encarta. Wikipedia achieved a traffic rate (market share) of 3.84% compared with 1.9% for Answers.com and 1.81% for Encarta. However, the most popular website in the overall reference category is Dictionary.com with a traffic rate of 4.46%.
AT May 05
IBM has designed a small unit which is intended to help people with hand tremors to use a computer mouse. It has granted a licence to a small UK company, Montrose Secam Ltd., to manufacture the unit. At less than £70, incl VAT, the unit is affordable by most libraries and represents a significant improvement in the service that can be offered to a segment of the community which is often ignored.
AT Apr 05
On 24th February a US report was released which suggested that 2005 might be the year that broadband power-line technology (BPL) becomes a realistic third option for computer users, adding to the existing high-speed connection options of cable or DSL.
BPL operates in a similar way to that of DSL and cable. The signal is carried along existing power lines and emerges through any standard electric outlet into a modem that is plugged into the socket.
The current BPL speeds are 500 kilobits to 1 megabit per second, although some of the world's major modem producers are developing technology allegedly capable of 100 megabits per second.
Unsurprisingly, urban areas are expected to provide the main market for BPL.
The Washington-based New Millennium Research Council report concluded that consumer and regulatory issues are by no means insurmountable.
BPL is not being ignored in Europe. The telecom standard organisation, ETSI, carried out laboratory tests and the French electricity supply company, EDF, carried out field tests on medium and low voltage networks at a site just outside Paris in November 2004. The electricity supply companies are not yet completely convinced that providing communication systems for the public is a good commercial proposition. However, they can see great benefits in using the technology to manage their networks.
AT Mar 05
MORE on BPL
At the Intel Developers Forum in San Francisco on 24 Aug 2005, the HomePlug Powerline Alliance, the powerline communications standards group, announced that Intel Corporation, Linksys and Motorola, Inc. had become new sponsor members. Subsequently, Intel's Matt Theall was elected as president of the Alliance.
The inclusion of these electronics industry heavyweights in the group suggests that powerline communication systems are now being taken very seriously, at least in the USA. There appears to be only one UK member of the group: Pace, the set-top box manufacturer.
AT Sep 05
A recent market survey by Janco Associates has shown that the Mozilla Foundation's Firefox browser has become the second most used browser in North America with 4.48% of the market. Of course, this is a long way behind Microsoft's Internet Explorer (84.85%). However, Mozilla can claim that every user of its product made a deliberate product choice, something that Microsoft cannot claim, due to its past bundling policy.
Although the Firefox market share is minuscule compared with that of Internet Explorer, it is still impressive - Firefox was only upgraded from the beta version in November. Even more encouraging for Mozilla is the fact that Firefox is the only widely used browser which is increasing its market share.
The rapid growth in market share of Firefox has been mainly driven by the blogger community where its market share is a very respectable 35%. This suggests that there is much more growth to come for Firefox, as knowledge of it leaks out into the general population.
The also-rans in the browser race are Netscape (3.03%), AOL (2.20%), MSN (0.58%) and Opera (0.34%).
Microsoft does not have a history of tolerating competition and another study, by Gartner, concludes that the software giant will respond to protect its market position. It certainly was not slow to do so in its war against Netscape.
Ironically, the destruction of Netscape by Microsoft is responsible for the creation of the Mozilla suite of Firefox (web browser) and Thunderbird (mail client). One of the last acts of Netscape as an independent company was to make the Netscape source code available to the open source community and Mozilla has built on this.
Version 1 of the Thunderbird email client was released in December 2004 and is a direct competitor of Microsoft's Outlook Express. In the first month after its release, two million downloads of the software were made. The advantages of Thunderbird over Outlook Express are far faster operation and an inherently cross-platform product (the open source legacy).
UPDATE ON FIREFOX
A market survey published on 26 April showed that Firefox has continued to find consumer favour and has more than doubled its market share (10.28%) at the expense of Internet Explorer (83.07%), AOL (0.85%) and Netscape (0.92%).
More details: https://www.e-janco.com
The way that the internet scene is developing in the USA is of particular interest to the UK, as there is a tendency for the UK to follow the same path. A survey carried out by International Demographics found that the US growth in internet use has changed direction. Young people were the initial driving force but, in the last four years, internet growth has been dominated by people who are over 55 years old.
AT, Dec 04
Mobile phones are disruptive and annoying in a library environment and most public libraries are plastered with notices asking people to turn them off. These requests are comprehensively ignored and the nuisance goes on. Most library staff seem to have given up making an effort to enforce the rule, just as the police have opted out of enforcing the law on using cell-phones while driving.
So, the mobile phone has taken over and there is nothing that we can do about it. Well, not quite. QinetiQ, the defence research organisation, has perfected the manufacturing process for a wallpaper that will screen out mobile phone signals. This process prints a frequency selective metal pattern onto either flexible or rigid substrates using far fewer production stages than would be necessary at present. It is therefore very cost effective.
The next time that a library needs to be refurbished or built, it would be a good idea if this type of wall surfacing was included in the specification.
Of course, there would have to be consideration of the radio-frequency leakage effect of windows. However, the new process can be used on rigid substrates, so it is probably extendable to glass. Even if it was not, the technology of incorporating metal patterns in glass is already well established for car window heaters and adhesive, transparent screening films are available.
Just think. All those notices asking people to turn off their mobile phones could be replaced by notices telling them that they are entering a mobile phone quiet area; there would probably be a queue to get in.
One side effect of screening a library from mobile phone signals is that any wireless local area network set up in that library could be made inaccessible from outside, usually considered to be a good thing.
At a Developer Forum in Frankfurt recently, Tim Johnson of UK research company Point Topic suggested that broadband internet use is growing at a faster rate than mobile telephone use ever managed to achieve. He said that the present estimate is for 150 million broadband lines to be installed by the end of 2004 and that South Korea is the country leading the change to broadband use with 24% of the population having this type of internet access.
The Ofcom April 2004 broadband connection figure for the UK is 3.99 million (2.45 million DSL connections + 1.54 million cable connections). Comparison with the South Korea figure is difficult, as it is not clear from the Ofcom data how many people use each broadband connection in the UK on average. However, Ofcom states that 15% of homes have broadband, so it is reasonable to suppose that the UK cannot be very far behind South Korea.
BT has frequently been cast as the villain in the broadband story. It has been accused of deliberately delaying conversion of telephone exchanges in order to obtain a commercial advantage. However much truth there is (or was) in the accusation, it is becoming irrelevant, as 84% of UK homes and businesses now have the DSL service available and this will soon (summer 2005) be improved to 99.6% of all households. Although the cable availability map overlaps the DSL one, the actual overall broadband availability will be slightly higher. Even those homes a long way from an exchange may soon be able to benefit from a broadband connection, as BT begins 500kb/s ADSL trials over 10km lines.
Next year, the telephone exchange conversion program will be virtually complete and it is obvious that the government deadline of all homes having broadband capability (if required) by 2008 will be met. It only remains for the potential users to be persuaded that high-speed internet access is worth the money. Once the infrastructure is in place and paid for, there will be an enormous pressure on the service providers to maximise the utilisation of that infrastructure. Thus, it is to be expected that the cost of a broadband contract will continue its downward slide and the services provided will be expanded further.
This view of the future is not restricted to the writer. During its recent, first Investment Analyst Conference, Ofcom stated: By the end of the decade, the successful introduction of greater infrastructure competition combined with continued innovation in access technologies, could enable a majority of UK households to benefit from affordable and accessible broadband connections delivering video-quality bandwidth [i.e. very wide bandwidth].
The question arises as to how this changing situation will affect Londons Library Services. In the short term, there probably will not be much of an impact. There will be a continuing need to provide a computer literacy education program and to ensure that internet access is not exclusively for the affluent. Lending of books, CDs etc will go on as before.
In the long run, when hardware and on-line access costs have reached a very low level, the public internet access facilities now provided by public libraries will be of small value to the community and their popularity will wane. Further, the computer literacy initiative will have worked its way through that part of the adult population reachable by such a program (the children are and will be covered by school programs). At best, there will be a small residual requirement for on-line access for visitors to the capital and a minority of residents. The People's Network will have done its work and will probably need to be retired.
Long term developments in the range of services provided on-line and new hardware designs are likely to have an increasing impact on the core library activities that are, at present, not seen to be related to computer use. Probably the first area to be affected will be CD / DVD hire. With fast download, 24/7 availability and a vast catalogue to choose from, the on-line businesses will have major advantages over any high street operation, whether in the public or private domain. Added to this advantage is the inherently low overhead cost of the on-line businesses.
It is just possible to see the beginning of this shift in market position in the pain now being experienced by traditional record shops. The more volatile part of the record industry, i.e. the pop record part, is suffering badly now, but the classical end is just as vulnerable in the long term. The record industry may well succeed in combating copyright infringement. However, this will do little to help the high street record shop: the die is cast, and public libraries will not be immune.
Once it is accepted that the contents of CDs can be obtained more easily and more cheaply via the internet than from a public library, it becomes clear that all associated electronic media will eventually also fall into this category. Thus the public library video tape and DVD collections, which have been so enthusiastically built up, will become wasteful space fillers from an income generation viewpoint. At some point, it will be necessary to decide whether to remove these items from the shelves or to remove the charges. If removal from the shelves is the preferred option, will books be substituted?
There are many electronics development projects still in the research phase which will accelerate the trend already identified and these will eventually come to bear on all core public library activities. In the very long term, even the book lending activity will not be immune from radical change.
Speaking at the recent Public Library Authority Conference, Lord McIntosh, the Libraries Minister, stated: "Books are here to stay because people want them and they will be a key element in the library service in the foreseeable future." It is true that books are very popular at present, but one should ask why this is before assuming that they will continue to be so.
After 500 years of printing development, books have become relatively cheap, convenient, easily obtainable and easily used containers of information (factual and fictional), i.e. the book is currently the preferred delivery method for most detailed content. If an improved delivery method becomes available, it is inevitable that it will replace the currently preferred one. Research is underway in many parts of the world in preparation for this new product type.
There is little doubt that Library Services, over the next few decades, can be expected to change out of all recognition. Interesting times?
The concept of the portable electronic book originated in 1968 and has since been slowly developed. The speed of development has been dictated by the lack of suitable electronic components. However, this situation is changing and the rate of change is accelerating.
In particular, the range and quality of displays is improving fast. It is still not possible to find a commercial, portable display which can match the resolution available from good quality printing on good paper but, after less than half a century of development, it is possible to believe that this standard will be achieved in the short to medium term. This should be compared with the half a millennium the print industry required to reach this quality level.
Late April will see the introduction of Sony's e-book reader, the LIBRIe. This utilizes the Philips / E Ink electronic ink technology to give a monochrome display with a resolution of approximately 170 pixels per inch (similar to newsprint). Although this resolution is not high, it has certainly proved acceptable to millions of newspaper readers throughout the world.
The real advantage of the LIBRIe display technology is that it is non-volatile, i.e. it only draws electrical power when changing the display contents. This mitigates the second problem associated with the electronic book concept: the poor power density available from current batteries. High power density batteries are a prime requirement for a successful e-book machine, as the reader must be light-weight and easily handled. Although a considerable research effort is underway to solve this problem, commercialisation of the results has not yet materialized. The LIBRIe's power source is made up of four cheap, widely available, standard AAA batteries, sufficient for reading 10,000 pages.
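A rough calculation shows why non-volatility matters so much. Taking the quoted figures at face value, and assuming roughly 1.2 Wh of usable energy per alkaline AAA cell (our illustrative figure, not Sony's), the energy budget per page turn is tiny:

```python
# Back-of-envelope energy budget for a non-volatile e-ink display.
# ASSUMPTION: ~1.2 Wh usable energy per alkaline AAA cell.
AAA_WH = 1.2          # assumed energy per cell, watt-hours
CELLS = 4             # the LIBRIe uses four AAA batteries
PAGES = 10_000        # quoted battery life, in page turns

wh_per_page = CELLS * AAA_WH / PAGES
joules_per_page = wh_per_page * 3600  # 1 Wh = 3600 J
print(f"{wh_per_page * 1000:.2f} mWh per page turn (~{joules_per_page:.1f} J)")
```

Under these assumptions each page turn may draw only about half a milliwatt-hour, which is only achievable because the screen consumes nothing between turns.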
One inherent advantage of the e-book concept is the number of books that can be carried by one person, a fallout from the intense development effort aimed at improving computer storage technology. The LIBRIe can store 500 downloaded books. The range of books commercially available is continually increasing and, with programmes such as Project Gutenberg (see below) dedicated to increasing the availability of digital books, there is no doubt that there will be sufficient content to justify the LIBRIe's storage capacity.
There have been several false starts in the e-book area and it is just possible that this new entrant will be another one. However, it is clear that each new entrant moves a step closer to the ideal product and eventually there will be one that is sufficiently close for the public to use it enthusiastically. If the experience of similar products is repeated, when that happens the switch is likely to be very fast. This has profound implications for publishing, book retailing and libraries of all kinds. The mess that the recorded music business has got itself into by ignoring technical developments would pale into insignificance if the book world followed suit.
AT April 04
One of the problems with implementing a community WiFi system is the short range obtainable with the standard antennas available for the IEEE 802.11 a/b/g specifications. This is now being addressed by Intel and other companies as they carry out development of components designed to meet the provisions of the new standard IEEE 802.16a.
WiFi was originally intended to give communications only over a relatively small area, such as offices or cafe hotspots. However, the potential for more universal use was quickly recognized and many community-wide schemes have been successfully set up. These have needed careful design in order to overcome the inherent limitations of the WiFi specifications.
The new WiMax standard is intended to provide greater range and bandwidth than the existing WiFi specification. Potentially, IEEE 802.16 equipment will give data transfer rates of up to 70 Mbit/sec over a range of 30 miles. This compares with 11 / 22 / 54 Mbit/sec over a range of (say) 2 miles for WiFi. Intel has a test system operating from the roof of its Santa Clara headquarters which has received data from a distance of 12 miles.
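To put the headline rates in perspective, a short calculation shows how long a 700 MB CD image would take to move at each quoted speed. Note these are shared, best-case figures; real-world throughput is always lower:

```python
# Illustrative transfer times for a 700 MB CD image at the headline
# data rates quoted for WiFi (11 / 54 Mbit/sec) and WiMax (70 Mbit/sec).
# The quoted rates are shared, best-case figures, not guaranteed throughput.
FILE_BITS = 700 * 8 * 10**6   # 700 MB expressed in (decimal) bits

for name, mbit in [("802.11b WiFi", 11), ("802.11g WiFi", 54), ("802.16a WiMax", 70)]:
    seconds = FILE_BITS / (mbit * 10**6)
    print(f"{name:14s} {seconds / 60:5.1f} minutes")
```

At the quoted rates the transfer shrinks from roughly eight and a half minutes on 802.11b to under a minute and a half on WiMax, and WiMax does it over many times the distance.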
WiMax, when it becomes available in 2005, will allow the copper / fibre optic last mile connection to be replaced with wireless for broadband internet use. This will be of great interest to ISPs (internet service providers) that are frustrated by the slow conversion of BT's exchanges to broadband use.
The designers of IEEE 802.11 equipment are also busily seeking ways of improving performance. Motia's smart antenna design is claimed to extend WiFi range by up to four times and the Atheros variable bandwidth (channel bonding) technology will allow data rates to rise to 108 Mbit/sec.
The potential for Library Services to extend their public access internet service is obvious. With WiMax, a 24/7 service does become economically possible.
More details: https://www.intel.com/ebusiness/pdf/wireless/intel/80216_wmax.pdf | https://www.tmcnet.com | https://www.wi-fiplanet.com/news/article.php/3301101 |
https://www.motia.com/press/pr120203.htm | https://www.atheros.com/news/adaptive.htm
AT Mar & Apr 04
A telephone poll during December 2003 by Harris showed that 69% of US adults were internet users (compare with the Ofcom figures below). This total has grown by 3% in three years. The most popular activity for users was emailing, with research for school and work a poor second choice.
Least popular activities were found to include financial management, making travel arrangements and obtaining information about health and disease.
More information: https://www.harrisinteractive.com/harris_poll/index.asp?PID=433
AT, Mar 04
In January Ofcom issued statistics on UK computer and internet use. The residential figures are:
59% of UK homes have a personal computer
50% of UK homes are connected to the internet
12% of UK homes are connected to broadband services
Thus, 41% of homes are without a computer. With the rapidly falling cost of computers, one would expect this figure to be a lot lower. If cost is not the sole, or even a prime, reason for the "information poor" to ignore the benefits of IT, what are the others?
Fear is perhaps one of the major reasons for the reluctance to acquire new IT skills. The oft-repeated phrase is "I am too old to learn new tricks". This response can come from people who are highly intelligent as easily as from people who have genuine learning difficulties, and from people who are far from old. Of course, it is a fallacy that has been disproved many times.
Fear of computers is just one aspect of a general fear of technology present in the UK and is, at least partly, caused by the "two cultures" syndrome, a syndrome strengthened and propagated by an education system that historically believed numeracy (the basis for all science & technology) was an optional extra.
The role of public libraries has always been to provide a second chance for those that missed out in the education lottery. In IT, the UK libraries are following this tradition and are providing a very valuable means of escaping from the "two cultures" sickness. The Ofcom figures suggest that there is still much to be done. All those library computers are still desperately needed.
AT, Feb 04
The country's 35,000 mobile phone masts that now cause great controversy could become things of the past. They would be replaced by airships, or small pilotless light aircraft, beaming down radio communications, if the EU-funded Capanina project is successful.
The University of York is leading the 14-partner international project to develop multi-media systems for installation in High Altitude Platforms (HAPs) - airships, balloons or small aircraft. These platforms would be positioned at an altitude of about 20 km, i.e. above airliner flight paths but below satellites. The platforms would have a good field of view over wide areas and, it is claimed, would offer better and cheaper communications than either existing cellular phone or satellite systems. It is expected that broadband internet connection speeds would be 200 times faster than possible with wired ADSL.
Because they would operate at extremely high frequencies (47-48 GHz or, outside Europe, 28-31 GHz), HAPs would be able to deliver broadband internet and mobile telephone access as well as large numbers of TV and video channels. They could even replace current satellite or cable TV systems.
Normally, such technology is proposed as the answer to the poor broadband internet provision in rural areas, but the university is also suggesting that HAPs could be of great use in urban situations. Whilst high flying airships and small aircraft appear to be fairly safe options for urban sites, the cables of tethered balloons (expected to be the first type of HAP to be deployed) would seem to pose a very high risk to the airliners which thickly populate the skies above cities such as London.
Urban populations are attractive to commercial ventures and, presumably, this is the reason for the interest in this type of site. The university has suggested that the HAP antennas could be directed at city centres during the day and at suburbs during the evening. How acceptable a part-time service would be to customers remains to be seen.
The use of high altitude antennas mitigates the problem of poor coverage which usually plagues extremely high frequency transmitters. However, it does not completely remove the problem and it may be necessary to use several/many platforms to give acceptable coverage in places like central London.
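The scale of the coverage problem can be sketched with the standard free-space path loss formula. The geometry below (a platform at 20 km altitude serving a user up to 30 km away on the ground) is our own illustrative assumption, and real links would suffer additional rain and atmospheric losses at these frequencies:

```python
import math

def fspl_db(distance_km: float, freq_ghz: float) -> float:
    """Free-space path loss in dB (standard formula: d in km, f in GHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

# A 48 GHz platform at 20 km altitude, for a user directly below it...
overhead = fspl_db(20, 48)
# ...versus a user 30 km away on the ground (slant range ~36 km).
slant = fspl_db(math.hypot(20, 30), 48)
print(f"overhead: {overhead:.1f} dB, edge of coverage: {slant:.1f} dB")
```

Even directly below the platform the free-space loss exceeds 150 dB, which is why highly directional antennas, and possibly several platforms, would be needed for dense urban coverage.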
The University of York is planning to launch a spin-off company to exploit the potential this work has to offer.
Last year, the Swedish Space Corp. successfully demonstrated a broadband link over a distance of 310 km via a balloon drifting at a height of up to 29.7 km.
Further information: https://www.york.ac.uk/admin/presspr/haps.html https://www.alvarion.com/RunTime/CorpInf_30130.asp?fuf=281&type=item
AT, Feb 04
Update on HAP Broadband:
Trials in October 2004 successfully demonstrated 120Mbit/s links from tethered balloons and in the summer of 2005 trials using untethered balloons will take place.
In the summer of 2006, the project partners will collaborate with Japan's National Institute for Information and Communication Technology on a global HAP trial using solar-powered unmanned aircraft. This type of aircraft has been shown to be very practicable by NASA's Helios flying wing project. One of the organisations working with NASA on that, Japan Stratosphere Comms, has become a HAP project partner.
Further information: https://www.capanina.org
AT, May 05
The long-term CD/DVD storage problem was originally discussed in Nov. 03. CD & DVD manufacturers have been, understandably, rather low key about the subject but, now, there is an independent source of information. This is the "Digital Preservation Program" sponsored by the US National Institute of Standards and Technology. A 50 page guide is available on the storage and handling of CDs and DVDs. For those who have only minutes to allocate to the subject rather than hours, there is also a one page summary.
AT Feb 04
The US government has just started operating a website dedicated to providing information on the latest computer viruses. Although it recommends using the websites of the anti-virus software suppliers, it does give useful advice on what to do if/when your computer is infected. This is very sensible, as the internet often becomes inaccessible when a computer has a virus. Of course, it is preferable to visit the government site before being infected.
For more information on computer viruses (including the MyDoom.B virus) visit:
AT Feb 04
All things within the IT industry develop at a very fast pace and grid computing is no exception. Since our original article on this subject was written in April 03, tools to ease the implementation of new schemes have matured and, possibly as a result of this, there is a degree of specialization creeping into the field.
The grid computing field has divided into two sectors - public and private grids. The sector differentiator is whether the grid network is closed or open.
The simplest, easiest to manage grid network is the closed, self contained or private type. Most academic and company grids fall into this category.
Public grid networks are more complex than the private type and are therefore more difficult to manage. However, they are potentially far more powerful. Because there is less control over individual computers, more redundancy has to be built into public grid networks. Also, communication speeds tend to be slower, as many of the computers will not have the best broadband access or even low speed ADSL. These difficulties are small drawbacks when very large numerical problems have to be solved.
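The essence of any grid scheme, public or private, is splitting one large numerical job into independent work units, farming them out, and combining the partial results. The toy sketch below uses ordinary Python multiprocessing on one machine to stand in for a real grid scheduler such as the tools discussed next; a genuine grid would dispatch the same work units over a network instead:

```python
from multiprocessing import Pool

def work_unit(bounds):
    """One independent work unit: the sum of squares over a half-open range."""
    lo, hi = bounds
    return sum(n * n for n in range(lo, hi))

if __name__ == "__main__":
    # Split one large job into independent units, run them in parallel,
    # then combine the partial results -- the basic shape of a grid scheme.
    units = [(i, i + 250_000) for i in range(0, 1_000_000, 250_000)]
    with Pool(4) as pool:
        partials = pool.map(work_unit, units)
    print(sum(partials))  # same answer as processing the whole range serially
```

Because each unit is self-contained, a public grid can simply hand out the same unit twice (redundancy) when it cannot trust an individual computer to return a result.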
Grid computing tools are available from Sun and from the Globus Alliance.
The Sun Grid Engine has now reached version 5.3 and is available in two versions. The small scale version is free and should be suitable for most private grid networks. The large scale version is available with technical support and, of course, must be paid for. Sun has built its Grid Engine on the Linux and Solaris operating systems.
As the name implies, the Globus Alliance is an organization which has brought together many companies, US government departments, quasi-government organizations and academic institutions. Included in the list of collaborators are IBM, Microsoft and NASA. Inevitably, the collaborators are mainly US based, but the UK is represented by the e-Science Grid Core Programme.
The Globus Toolkit is now in version 3 and has been developed via an open-source strategy, similar to that used for Linux development.
AT Jan 04
The BBC has used its extensive internet experience to set up a new website - www.bbc.co.uk/ican. It has been working on the project for over a year and has now reached the beta release stage (development close to completion).
The new site will provide a forum to discuss local issues. Issues currently headlined include Post Office Closures, Wind Farms, Chewing Gum and Equal Rights for Fathers. Unsurprisingly, items on BBC programs are also prominent (perhaps just as interim space fillers). The new website could provide a very useful platform for debate if (when?) library closures again become fashionable.
A similar site to the new BBC one is at www.upmystreet.com. Although this site has the distinct disadvantage of being heavily encrusted with adverts, it provides useful information on a user-defined postcode area (schooling, housing, council) and includes a discussion forum.
Of course, there are many community websites run by volunteers which provide local information and discussion facilities for their areas.
AT Dec 04
Compact discs have been around for some time. It is therefore rather surprising that two very basic problems with their operation have now surfaced.
The first problem is that occasionally they shatter and throw plastic shrapnel out of the disc drive at high velocity. There is a high probability that the disc drive will suffer damage, and the computer user will not be immune if he/she is in the wrong position at the time. It is believed that the cause of the problem is slight imperfections on the edge of the disc's hub opening. At the high speeds used by current CD drives, small nicks & cracks can rapidly propagate to produce a catastrophic failure.
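A rough calculation shows why a shattering disc is so dangerous. Assuming a spindle speed of around 10,000 RPM, of the order reached by fast 52x-class drives (our illustrative figure), the rim of a standard 120 mm disc is moving well beyond motorway speeds:

```python
import math

RPM = 10_000        # assumed spindle speed of a fast (52x-class) CD drive
RADIUS_M = 0.06     # outer radius of a standard 120 mm disc

omega = RPM * 2 * math.pi / 60   # angular velocity in rad/s
rim_speed = omega * RADIUS_M     # linear speed at the disc edge, m/s
print(f"rim speed ~{rim_speed:.0f} m/s (~{rim_speed * 3.6:.0f} km/h)")
```

At roughly 60 m/s (over 200 km/h) at the rim, any fragment released by a propagating crack leaves the disc like a projectile.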
The second problem is that some CDs have been found to be unreadable after a very short time (of the order of one year). Several causes for this problem have been suggested. Most of these point to possible poor storage conditions e.g. high ambient light, high temperatures. However, the adhesive on disc labels also seems to be a high probability cause of degradation and a tropical fungus is an additional suspect.
AT Nov 03
MIT, the world-renowned Massachusetts Institute of Technology, has published its course-work on the internet at https://ocw.mit.edu/index.html. There are 500 courses in 33 academic disciplines.
There is no cost associated with the use of the course-work, apart from the telecom costs. This move can only enhance MIT's reputation for innovation and leadership. One wonders how long it will take for the Open University to follow - to truly become Open.
AT Oct 2003
The operating system running on over 95% of the world's personal computers is a version of Microsoft's Windows program. One of the reasons for this near monopoly is the close co-operation which has existed between Intel, the dominant microprocessor manufacturer, and Microsoft.
It now looks as though Intel has decided to hedge its bets. It has announced that its new "Vanderpool" chip will be capable of running more than one operating system at a time. This will be a direct benefit to Linux users and, thus, will not make Microsoft happy.
Linux is beginning to chip away at the Microsoft market and, given its very large price advantage, it can be expected to continue to grow in popularity. Some PC manufacturers are now offering Linux as the standard, inbuilt operating system on some of their products.
The slow erosion of the Microsoft market will not have gone unnoticed by Intel and, when the Vanderpool chip reaches the marketplace in five years' time, Linux can be expected to have become a significant competitor to Windows.
AT Oct 2003
In the early days of personal computing, the range of fonts available was very limited unless users obtained Adobe's PostScript set. This changed in 1991/92, when Apple and Microsoft both incorporated TrueType fonts in their offerings. This Apple-developed set gave users access to a wide selection of fonts without incurring the very high costs associated with PostScript.
The situation remained unchanged until 1996. Then arch-rivals Apple and Microsoft agreed to develop a common font format called OpenType. The results of this collaboration began to appear with Windows 2000 and, with Windows XP, it is now almost fully implemented.
Adobe has not been standing idly by and has released its full font library in OpenType format (OTF). Since OpenType is natively supported in Windows 2000 & XP, no add-on utility is required to run the Adobe fonts on current Windows platforms.
Independent font foundries are following the lead of Microsoft, Apple and Adobe, so OpenType fonts appear to be becoming the accepted industry standard. The advantage of this is that the appearance of a document will not change with a change of computer or application. The design becomes fixed at the originator's monitor.
A.T. Oct 2003
The London Library Development Agency (LLDA) has been awarded £200k by the People's Network Excellence Fund for the LLDA's WILL (What's in London's Libraries) project. This project aims to link up all the catalogues and community information databases of London's public libraries. Some museums and archives databases will also be included as a pilot.
A web interface will be developed to allow users to simultaneously search a range of resources across London. It is not intended to develop a single database.
The project is expected to be completed by September 2003.
AT, July 2003
PS @ Oct 03
You may momentarily be confused by the above link, but it is not the LLL address - just a very, very, very close approximation to it. In addition, if you happen to investigate further, you will find that the WILL logo looks extremely like the Thames diagram which has been at the top of this page for several years.
London is something of a super-tanker - slow to respond and sometimes slow to seize opportunities. This appears to be what has happened with WiFi community projects. Cardiff has led the way in this area with a project, called "Arwain". This has equipped Cardiff with hotspots which give city centre dwellers free broadband access to on-line services and information. The hotspots are actually wireless local area networks (WLAN). The technology has only recently become available and, naturally, is being applied enthusiastically in the USA. One of the US projects even has a range of 30 miles and is being aimed at coastal shipping!
Arwain is part of Cardiff's assault on the digital divide and a similar project really should be included in London's strategy. The geographic spread of libraries throughout the capital makes them ideal places to establish hotspots for broadband connection. A small neighbourhood library is as good as a large flagship library in this respect and, because it is often physically closer to its user community, it may even be better. Libraries are already delivering free broadband internet access to their users and this new development will simply extend that to a 24-hour, 7-days-a-week service.
There is one difficulty with the WiFi community idea. It will not directly help the poorest of the poor, because it is necessary to have a computer at home to take advantage of the facility. However, this is where computer recycling comes into the picture. In the longer term, rapidly falling computer costs (a new internet-ready computer now costs less than a mid-range television) will erode the disadvantage.
I am not aware of any general strategy to develop public use of the internet beyond the excellent, but restricted, facilities presently provided in libraries and community centres etc. By giving the user freedom to choose when he/she can use the internet, WiFi community projects seem to provide the necessary second step on the ladder up to full digital equality.
More information at www.arwain.net
AT, April 03
About two years ago the Search for Extra Terrestrial Intelligence (SETI) project analysed huge quantities of data from space by utilizing, via the internet, millions of home computers around the world. This was the first large scale example of grid computing.
Library Services throughout London now own a large number of public access computers, thanks to the People's Network initiative. For most of the day, these computers are unused. So, why not use them for grid computing? Compared to the SETI project, coupling up the public access computers in libraries for a London-wide grid should be fairly straightforward and would give London's Library Services a very valuable "overnight resource" to market to researchers. It is possible that, after the initial expenses, this could be the way that future funding for public access computers is obtained, or it may be an additional reason for non Local Authority money to be provided. Whatever the outcome, grid computing is something that London's Library Services can benefit from.
The government certainly thinks that grid computing has possibilities. It recently joined Oxford University and IBM in a grid computing project, called "eDiamond", connected with breast cancer screening.
AT, April 03
Due to the well funded "People's Network" project, London's libraries are now fairly well stocked with public access computers. Although there has been some criticism of the methods used to make room for them (see side bar), these computers have proved very popular with library users. That is, they are popular with those library users who are able to operate them. For those people with a language, motor or visual impairment, the digital divide is still alive and well and is operating in a library near you. This need not be the case, as assistive technology is available to provide help.
For computer users with a language impairment, screen review utilities make on-screen information available as synthesized speech and couple the speech with a visual representation of each word, e.g. by highlighting a word as it is spoken.
Mobility impairment has many forms and there are several pieces of hardware (large keyboards, small keyboards, keyboards with alternative layouts) which can simply be plugged into a standard computer to provide some alleviation. Where mobility is severely restricted, it is possible to use a scanning option with an on-screen keyboard to allow individual keys to be highlighted and then selected by a switch positioned near a part of the user's body over which he/she has voluntary control.
Visually impaired computer users have probably the widest range of hardware and software to choose from. These include screen enlargers, screen readers, speech recognition systems, speech synthesizers (audio output of keyboard input), refreshable Braille displays (tactile output of computer screen), Braille embossers and talking / large print word processors.
It may be too much to expect every public library to provide computer access for everyone irrespective of their level and type of disability. However, some improvement on the present situation is essential.
AT, March 03
The electrical power distribution company Scottish & Southern Energy has become the latest organisation to attempt to use the nation's power lines as a substitute for the telephone cable system.
There are over twenty other European power companies carrying out trials involving the transmission of high bit rate data over their distribution networks. In the German city of Mannheim, approximately 3000 homes are involved in the trials there. However, the Scottish & Southern trials are on a much smaller scale with only 50 customers taking part.
For over a decade companies have been trying to push digital communication through powerlines. Although several companies reached the public trial stage, none have yet been successful enough to launch a commercial operation.
Initially, the main problem that had to be overcome was the high level of interference that existed on the powerlines. This difficulty has now been largely solved but it has been, paradoxically, replaced by the problem of interference caused by the digital signals themselves. The powerlines act as antennas and it is feared that they will broadcast interference in the high frequency radio bands. This would disrupt international broadcasting, amateur radio and maritime / aircraft communication & navigation.
AT January 03
The Office of National Statistics has announced that, for the year to the end of August, internet subscriptions had increased by 11.8%. However, for the July - August period, the number of subscriptions had fallen by 0.3%.
The July - August decrease was caused by a 1.0% fall in dial-up subscriptions, but this was partly masked by a 9.9% growth in broadband connections. As the proportion of broadband subscriptions is quite low (7.2% in August), rapid growth in this type of connection has only a marginal effect on the overall figure.
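The overall monthly figure is simply a subscription-weighted average of the two sectors, and the quoted numbers can be roughly checked. The weights must be the July (start-of-month) shares: if broadband was 7.2% of subscriptions in August after growing 9.9%, its July share was approximately 7.2% / 1.099, about 6.6% (an approximation, since the small change in the overall total is ignored):

```python
# Reconstruct the overall July-August change from the quoted sector figures.
# ASSUMPTION: July broadband share back-calculated as 7.2% / 1.099 ~ 6.6%,
# ignoring the small change in the overall subscription total.
bb_share_july = 0.072 / 1.099      # approximate start-of-month broadband share
du_share_july = 1 - bb_share_july  # remainder is dial-up

overall = du_share_july * (-0.010) + bb_share_july * 0.099
print(f"overall change ~{overall * 100:.2f}%")
```

Under these assumptions the weighted average comes out near -0.3%, matching the reported fall: broadband's rapid growth almost, but not quite, offsets the dial-up decline.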
The Office of National Statistics figures support the OFTEL findings on the increase in broadband connections (see article below). The growth for the year in this area has been 354.8%.
It is expected that the growth in internet subscriptions will decrease in the summer months, but a fall is unusual. Due to cost differentials, the broadband community is dominated by business users, whilst the dial-up community consists mainly of home users. Therefore, the implication of the ONS figures is that the public's interest in the internet is beginning to moderate. The winter months will determine whether or not this is correct. If it is, very high growth in the number of home users will only be restored when the cost of a broadband connection becomes generally affordable - the magic figure has been suggested as £9.99 / month.
AT October 02
At the beginning of May, OFTEL reported that there were 500,000 broadband subscribers in the UK (see "The All Pervasive Internet" below). Now it says that there are one million. This fast increase is presumably due to the recent decrease in cost.
However, the UK broadband total should be compared with the situation in Japan, a country with a similar sized population. There, the Ministry of Public Management, Home Affairs, Posts & Telecommunications announced that the number of Japanese broadband subscribers was 4.22 million at the end of September.
AT October 02
Citizens Online has announced the first local launch of its national project "Everybody Online". The first launch took place in the China Clay area of St Stephen in Brannel in Cornwall and was received very positively by the community as a whole.
In partnership with BT, Citizens Online will be working in selected low connectivity areas of the UK. The aim of the Everybody Online project is to work with local partners and the local community, using facilities that are already available. Citizens Online has found that, in most areas of the UK, there are untapped resources available to fulfil the needs of local citizens; the project seeks to bring these together.
For further information about Everybody Online, the St Stephen launch and other locations where they will be working, visit their website -
AT September 02
No, not another web bookshop. This is a University of North Carolina project to give everyone direct access to a wide range of books via the web.
"The Project Gutenberg Philosophy is to make information, books and other materials available to the general public in forms a vast majority of computers, programs and people can easily read, use, quote and search" - not just the Bible and Shakespeare.
Of course, your local library also does this quite well - both via computers and real books. However, for the housebound, Project Gutenberg could be a very, very useful facility. The rural population may also find it of use. For the rest of us, it could be the answer to the problems caused by over-enthusiastic weeding of library shelves by the staff.
For people with a broadband internet service, reading books via this service would be cost-effective as well as convenient. However, a dial-up service would make it an expensive exercise. As the cost of a broadband connection falls and its use becomes widespread, Project Gutenberg should become a significant resource. Looking into the distant future, when existing laboratory development projects have emerged into the marketplace, this method of accessing literature could become dominant. The Project Gutenberg people just need a little patience. This is something they have in abundance - the project has been running since 1971.
There is an opportunity to contribute to the project, if you think it a good idea.
Further information: https://www.gutenberg.net
AT September 02
As you are reading this note, you are almost certainly one of the 50% of UK adults that use the internet. Also, if you are at home, then you are one of the 46% of UK adults that logs on from there. These figures have been gleaned from an interesting Oftel report to be found at https://www.oftel.gov.uk/publications/research/2002/q8intr0402.htm.
It seems that 4% of UK adults use the internet from locations outside the home. The workplace, internet cafes and public libraries are obviously the most significant of these other locations. Access to the internet from the workplace is, on the whole, a privilege enjoyed by the middle class, and internet cafes are predominantly the haunt of the more affluent. However, the option of using the internet from public libraries is a right which can be exercised by all sections of the community.
Public libraries have taken on the task of ensuring that the benefits and frustrations of internet use are introduced to the widest possible audience, irrespective of their ability to pay. The Oftel report indicates that they may be succeeding, as it notes that, since August 2000, there has been a marked increase in C2DE households using the internet.
Other figures from the Oftel report are:
43% of the 10 million internet homes use unmetered access
3% of internet homes have broadband access (at the beginning of May 2002 there were 500,000 UK broadband users of all kinds)
9 hours per week is the average household on-line time
IS THE PARTY OVER ALREADY?
For over a year, the Virtual Society? Programme has been warning that the great public interest in the Internet could be beginning to wane. These words of caution, almost drowned out by high-volume hype, result from a research programme being undertaken by 25 universities in Denmark, Holland, the UK and the US. This research has found that "Growing numbers of people, many of whom are teenagers, are becoming 'former users' of the Internet. Such drop-offs suggest that only limited sub-populations will experience saturation, and that providers will need to become skilled at selling new products to a relatively static customer base."
No doubt, the hype merchants, with high fees and salaries to protect, will dismiss such findings as unrepresentative. However, the consumer group Which? has found a similar trend among UK email users. It has discovered that the percentage of surfers choosing email as their preferred method of communication has fallen over the past year from 14% to 5%, whilst those surfers preferring face-to-face meetings has risen from 39% to 67% over the same period.
Equipment suppliers have provided further confirmation of the illusory quality of the predictions of ICT pundits with axes to grind. The user-friendly internet TV and its cheaper set-top box brother have failed to achieve significant sales. The trail-blazing Bush sets are being sold off cheaply as sales are currently running at less than 40% of planned levels.
The worldwide downturn in personal computer sales may also be partly due to a stagnating Internet related market.
So, does this suggest that the information revolution has ended before it has properly begun? No, the Virtual Society? research suggests only that the rather simplistic expectations of the policy makers will not be met. The Internet is causing significant changes in personal and corporate behaviour. Not unexpectedly, the changes which are occurring are not always to the benefit of society as a whole. Careful research, such as that carried out in the Virtual Society? Programme, can provide the information required to maximise the benefits and minimise the damage caused by the information revolution. However, for this to happen, policy makers have to accept that the changes which are occurring are not always benign. Sadly, if the experiences of library user groups are taken as a guide, any organisation which dares to suggest that much tighter ICT management is necessary by central & local government and others is simply ignored. The operating philosophy seems to be "Never Mind the Quality, Feel the Width".
INTERNET CONNECTIONS AND PCs IN LIBRARIES
These comments are based on the way I use public internet connections and PCs. Of course I know others will want different things from the machines. I have used several library systems around the country and have found that all too often they give the impression of having been designed by computer experts who do not understand what the user wants from them, nominally controlled by senior library staff who do not understand enough to direct the IT specialists effectively, and without users having been asked what they want to find. It is indeed hard to decide what the ideal public library system should include. These systems are presumably intended to attract inexperienced users, who will find it hard to say what they will eventually find useful. On the other hand, computer devotees (who are in any case unlikely to make much use of public library machines) might know about the systems available, but will probably not understand how non-devotees want to use them. It would be useful if someone could draw together what we, the users, want to do with the machines, and what we expect to find in the library. I am sorry if the comments show my ignorance of the technology, but if they do I am pretty sure I have plenty of company.
Downloading to a floppy disk
Downloading matters. Library systems should allow files to be downloaded to the user's own floppy disk, but most I have found don't. Many people, like me, have home computers which are too old to be connected to the internet. Also, serious study of documents often takes more time than is allowed by the library booking system. So, I want to copy internet material from library machines to a floppy disk I can use at home. For example, "Project Gutenberg" offers an immense range of out-of-copyright literature, including much that modern libraries would be hard put to obtain. And studying official reports, the latest DCMS news release and so on in depth is not a thing to do in the library. Also, as long as most library systems do not have "unzippers", downloading at least allows me to "unzip" at home.
Objections to downloading When I moan to library staff about the lack of downloading, I am usually told about the dreaded virus hazard. But this need not be an insuperable obstacle - Croydon library insists on virus-checking floppies, and it is always possible to allow only new floppies sold by the library (but, please, not as a petty money-making venture). Bromley has a system which requires the staff to install a floppy drive on request (I think this is needed because their machines are coin-operated and too many users put money into the floppy drive) - they do not try to interfere with the users once the drive is in place. All the cybercafes I know let users fetch their own floppies and even "upload" files, presumably without anything dire happening to their systems. I have also heard hints of a fear that the machines might be used to download pornography. But this problem, if it is a real one, ought to be controllable without negating an important function of the system.
The system should include an "unzipper" programme allowing "zipped" files to be put into a usable form, but most public library systems do not have this facility. This means there are serious limits on what can be read. This limitation is especially nonsensical because some documents available through the Department of Culture, Media and Sport website are "zipped". So these cannot be read on the very machines which are being installed to satisfy DCMS demands! Only those with home internet connections may know what they say. The cliches "socially excluded" and "joined up government" spring to mind.
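For readers unfamiliar with the jargon, an "unzipper" simply unpacks a compressed archive back into ordinary readable files. A minimal sketch of that operation, using Python's standard zipfile module (the file name and contents here are invented for the example; the archive is built in memory so the sketch is self-contained):

```python
import io
import zipfile

# Build a small archive in memory, then extract from it again - the
# same operation an "unzipper" performs on a downloaded zipped report.
# "report.txt" and its contents are invented for this example.
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w") as archive:
    archive.writestr("report.txt", "Annual library report")

buffer.seek(0)
with zipfile.ZipFile(buffer) as archive:
    names = archive.namelist()                    # list what is inside
    text = archive.read("report.txt").decode()    # recover the readable text
```

Without such a tool on the library machine, the zipped document stays an unreadable lump of bytes - hence the complaint above.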
Word-processors are needed All internet machines should have a word-processor installed on them to manipulate material captured from the internet before printing or downloading it. I have never found a library internet machine that has one. If I humbly grumble about its absence I am usually told something like "Oh, that machine is for the internet - if you want a word-processor you must use the machine over there" - staff just don't understand why I want both on the same machine. Downloading a Web page to a floppy disk for home processing is not the answer. This simply results in text lost among impenetrable code that has to be weeded out. Much better to "cut and paste" the internet text into a word-processor on the same machine, then save that in a form intelligible to the home computer. At the moment, if I find something interesting via the library's machine I head for a cybercafe to get a usable copy to take home. If the library allows printing but not downloading, a word-processor is still needed. Printing a webpage can be a vile task. Paper is wasted when the text is spread down long thin columns. Some web "pages" produce many printed pages, even when only a bit of the text is needed. Printing only "selected" text is likely to be impossible in practice (it rarely comes out in the right place on the printed page). Much better to "cut and paste" the text to a local word-processor, sort it out there and print only what is needed.
Why differentiate machines? Library Authorities might be encouraged to look at why they have different PCs for different tasks (internet, CDRom player or word-processor). No-one would set up a modern office in this way; it would mean too many machines cluttering up the place. Better to have every machine capable of performing every task.
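The "impenetrable code" surrounding a saved web page is its HTML markup. As a rough illustration of the weeding-out involved, here is a sketch using Python's standard html.parser module to keep only the readable text (the sample page is invented for the example):

```python
from html.parser import HTMLParser

# A saved web page is readable text buried in HTML tags. This parser
# collects only the text fragments and discards the markup.
class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        # Keep only non-blank runs of text between the tags.
        if data.strip():
            self.parts.append(data.strip())

# An invented sample page, standing in for a downloaded Web page.
page = "<html><body><h1>Library News</h1><p>Opening hours extended.</p></body></html>"
extractor = TextExtractor()
extractor.feed(page)
plain = " ".join(extractor.parts)
```

A word-processor's "cut and paste" achieves the same separation of text from code, which is why having one on the internet machine matters.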
The dreaded slowcoach Some systems in public libraries are fast - Croydon is the best I have found - but others are terrible (Lewisham, and Chorley when I was last there). Greenwich was fast, but is getting slower, presumably as the central system becomes more burdened. This is more than a minor grumble. Introducing new users to the internet via a library machine with an inefficient server results in them becoming utterly confused by its hesitations and delays - people who type faster than the system can work get the impression that their commands have disappeared into the ether, try again, and are rewarded with an unstoppable series of repeats or some other confusion. Such folk are likely to dismiss the internet as useless. Hardly what the DCMS says the government wants to achieve. I now encourage anyone new to the internet to have their first try at a good commercial cybercafe, where any confusion is at least due to the nature of the Web, and not the local system.
Excuses, excuses The common response of librarians to complaints does not help. Often it is to the effect of "Oh well, you must understand that the World Wide Web does slow up when it is busy." But modern cybercafes can keep up the speed, so it is hardly credible that a Local Authority system cannot do as well, and it is wrong of Local Authorities to use the half-truth that the Web can slow down to excuse the limitations of their own system, which is often the main culprit.
I accept that Local Authorities are likely to try to exercise some kind of control over what their machines are used to look at, but systems need to be run by people with a thorough understanding of what they are stopping, coupled with a genuine desire to allow the internet to be used as a gateway to the whole breadth of information available on it. A typical example of over-zealous censorship is Chorley, Lancs, where the censor (Cybersitter, I think) would not let me look up "Noise at Work Regulations 1989" - a few experiments showed it did not like the word "regulation". A grumble to the librarian produced an offer to switch the system off, but that is hardly a general solution - surely we should encourage folk wanting to find out about the laws of our land. Greenwich denies all access to Deja News newsgroups, apparently on the grounds that they are uncensored. But some of these newsgroups are there for serious discussions, and it is rather insulting to tell users they cannot have access because they might read something naughty. To the credit of libraries, however, I have not yet come across political censorship - I have been able to access even politically incorrect sites. But we users should be alert to censorship creeping in even here.
Access to internet machines
Systems open to all
A few libraries (eg Greenwich) allow only registered ticket-holders to use internet machines. This means that users who want access while moving around the country are denied access to the internet, or must pay cybercafe rates (if they can find a cybercafe) - they become "socially excluded". This might make some sense to the parochially minded, but it is better for society to regard libraries collectively as a national resource. When in a strange town I look for the public library as the place where I can obtain information, and just as I can look at any reference book it has, I should be able to use any available PC (unlike a loan book, I am not going to take the PC away with me).
What CDRoms can be played? Some libraries have both a loan collection of CDRoms and computers with CDRom players, but fight any suggestion that a CDRom from the loan section may be used in one of the library's CDRom machines (eg Bromley). I can see there are problems here. The loan CDRoms will probably include games, and many library users would not want to be told they cannot look up business information or use teaching programmes because all the machines are booked by people playing games. But, with firm management (including, I fear, a specification of what CDRoms may and may not be used on the library machines) this should be soluble. Once again, the usual explanation that I get from library staff - fear of a virus caught from a smuggled-in CDRom - does not convince. Good cybercafes can cope, and libraries should as well.
OPACS should be fast and simple The slowly growing number of On-line Public Access Catalogues available through the internet is welcome. But too many of them work badly. Systems which ask me to wait "a few seconds" while the system is loaded are especially annoying, particularly as they usually then go on to work slowly even when loaded. Many public lending libraries, as well as the British and Scottish National Libraries, show that catalogues can be fast and simple as well as easy to use. On these services we can do without flashy graphics and other slowing devices. Also, some of the catalogues provide only limited ways of searching the stock - again annoying.
OPACS should be fully available to all users The systems used should allow all those using the Internet to make full use of them - ie no matter whether their browser is Netscape, Opera or Internet Explorer, or whether they have Windows, Unix, Linux, Apple or whatever as the operating system. A little paranoia regarding tempting offers from firms promoting particular software systems is in order here. A public service should not be seduced into using a system that will work slowly or badly unless the remote user's machine has a particular firm's operating system installed on it.
REPLACING EQUIPMENT AND SYSTEMS
Start thinking about replacing the system before it is first installed Computers and their software have a horrifyingly short life, and need to be maintained by skilled staff. The rapid changes of the past seem set to continue - indeed, purveyors of systems and equipment appear to be growing more uninhibited about offering products that will only work badly with old systems. If those installing systems do not look well ahead, today's shiny new system, launched with much media hype, will end up a clunky, slow and unreliable relic that users reject. The finance for maintenance etc should be in place when systems are installed. And OPACs etc should be kept simple and as timeless as possible.