Accessibility and ICT
Guidelines intended to make ICT systems more usable by older and disabled people have recently been released.
The areas of disability covered by the guidelines include deafness (partial & total), blindness (partial & total), physical, cognitive and language. Older people tend to suffer from one or more of these impairments to a lesser degree and these are also covered in the guidelines.
More information: https://www.tiresias.org/research/guidelines/index.htm
AT June 08
Microsoft Stops Scanning Books
On Friday 23rd May, Microsoft announced that it was withdrawing from the Open Content Alliance (see Open Content Alliance below) and would stop scanning books. Microsoft was the dominant member of the Alliance and this must be a grave blow to its operations. Microsoft is, at present, scanning books in the British Library and it is not clear if this will continue.
The other major book scanner, Google, will now be the undisputed leader in the field.
More information: https://www.theregister.co.uk/2008/05/27/microsoft_google_book_monopoly/
AT May 08
Further evidence of the changing make-up of the internet population has been published. A research report, "An Overlooked Audience", released jointly by Focalyst and Dynamic Logic, shows that the 17 million US citizens born before 1945 spend a total of 750,000,000 minutes per day on the internet, i.e. on average 44 minutes per person per day. The increasing popularity of internet use among the mature section of western populations appears to be a firmly established trend - see The Greying of the Internet & UK's Internet Market Matures below.
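The reported per-person average follows directly from the two headline figures:

```python
# Check the Focalyst/Dynamic Logic arithmetic: total daily minutes
# spread across the reported user base.
total_minutes_per_day = 750_000_000
users = 17_000_000

average_minutes = total_minutes_per_day / users
print(round(average_minutes, 1))  # prints 44.1, matching the reported 44 minutes
```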
Conversely, the popularity of internet activities among the very young appears to be declining. Perhaps it is the more solid, mundane tasks that mature users perform on the internet that are the secret of its steady rise in popularity in this section of the population. There is nothing ephemeral about these activities:

Contact family and friends: 59%
Health and health-related information: 38%
Exchange photos with family/friends: 33%
More information: https://www.dynamiclogic.com/na/pressroom/releases/?id=605
AT May 08
Better Use of Libraries' Book Stocks
A universal cause for concern of Library Friends Groups is the state of the book stocks in their libraries. Library managers find it easier to reduce book stocks when budget cuts have to be made, rather than carry out more painful adjustments. Thus, repeated cuts leave public libraries in a parlous state. The library users react to this in a quite rational way: they go elsewhere for their books, if they can. This means that the middle class deserts the public libraries and the less prosperous are left with a progressively poorer range of books to borrow.
Of course, rebuilding book stocks is the only long-term solution to the problem. In the meantime, short-term measures can be taken to alleviate some of the symptoms of the illness. One of these short-term measures would be improved utilisation of the remaining stock.
It is a boast of professional librarians that they are able to point library users towards alternative authors, when those users find that the books of their favourite writers are no longer on the library shelves. I am not sure how many of the public actually use this opportunity, although I suspect that very few do. There has certainly been no attempt to publicise this redirection service in any public library that I have visited recently. This is a pity, as the users would gain immensely and, I think, library staff would find life would become more enjoyable (they would be talking about the subject in which they have a deep interest).
The danger of making a determined effort to encourage users to seek advice is that this activity may become too popular and swamp the staff. This will not be a danger for long, as there is a new, infant service which automatically does the same thing. The BookLamp internet service uses software to analyse the contents of books in a similar manner to the Music Genome Project which has been analysing music since 2000. The software records writing style, tense, perspective, action level, description level and dialog level. The user simply gives the software an example of a book that he/she has enjoyed and receives back book recommendations.
The effectiveness of the new service is dependent on the sophistication of the software and the number of books which have been scanned. At present, only a few hundred books have been scanned and this, obviously, limits potential choice. However, the number is increasing continuously.
The BookLamp scheme is a good idea, but it is probably a little too late to succeed as a stand-alone project. Both Microsoft and Google have massive book scanning operations underway and have already scanned many thousands of books. It would take very little effort for either or both to adapt their software to mimic that of BookLamp. The intense rivalry between the software giants will ensure that this quickly happens. While we wait for the definitive automatic book recommendation system, there is no reason why the old-fashioned, reliable manual system should not be used more.
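BookLamp has not published its algorithm, but a minimal sketch shows how recommendations can fall out of per-book feature scores of the kind it records (action level, description level, dialog level). The titles and scores below are invented; the similarity measure (cosine similarity) is one common choice, not necessarily BookLamp's.

```python
import math

# Invented catalogue: each book scored on [action, description, dialog].
books = {
    "Book A": [0.9, 0.2, 0.6],
    "Book B": [0.8, 0.3, 0.5],
    "Book C": [0.1, 0.9, 0.2],
}

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def recommend(liked, catalogue):
    """Rank every other book by similarity to the one the user enjoyed."""
    return sorted(
        (title for title in catalogue if title != liked),
        key=lambda t: cosine(catalogue[liked], catalogue[t]),
        reverse=True,
    )

print(recommend("Book A", books))  # 'Book B' ranks above 'Book C'
```

The quality of such a system depends entirely on how many books have been scored, which is exactly the limitation the article notes for BookLamp's few hundred scanned titles.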
AT April 08
More information: https://beta.booklamp.org/ , https://www.pandora.com
Socio-economic Barriers to Internet Usage are Disappearing.
A report on broadband access to the internet, published by the Organization for Economic Co-operation and Development, has found that some socio-economic barriers to internet usage are disappearing. Factors such as education, income, age, gender, and place of access have influenced internet usage and broadband access in the past.
Previous studies have found that better educated individuals were more likely to adopt broadband than those with less education. However, this study has now concluded that education is less of a factor affecting internet adoption.
While older people may use the internet as much as younger people, the elderly are less likely to go online for shopping and entertainment. A person's retirement age appears to be more of a factor than the person's actual age, the study found. Thus, the tendency for people to retire later has increased the use of broadband among the elderly. Presumably, the higher income from continued employment and the opportunity to obtain internet access at the workplace are the determining factors.
The report has found that men are more likely than women to download software, and women are more likely to engage in health-related activities and online shopping.
There are also gender differences in where internet access occurs. Men are more likely to gain access at both home and work in many of the OECD countries. Women are more likely to access the internet from educational establishments.
More information: https://www.oecd.org/dataoecd/44/11/39869349
AT March 2008
UK's Internet Market Matures
The market research organisation Nielsen Online has found that the UK's active home internet population fell in November by 1.14%. Of the 10 developed countries that the Nielsen survey covered, only two exhibited a fall. The other non-trend country, Switzerland, restricted its fall to 0.81%. Overall, the developed world's active domestic internet population increased by 1.96% in November, with Brazil (+8.31%) and Australia (+7.28%) showing the greatest growth.
Another survey from Nielsen has shown that, for the year Oct 06 to Oct 07, the under-25 year-olds' share of the UK Internet population has decreased from 29% to 25% - a relative drop in share of 16%. During the same period, the share made up of over-55 year-olds has increased from 16% to 19% - a relative increase of 22%. Overall, the average age of the UK home internet population has risen from 35.7 to 37.9 in the period. This result is very similar to that of a survey of US internet users taken two years ago (see The Greying of the Internet below).
The Nielsen surveys suggest that the falling UK internet usage is mainly caused by the younger part of the population becoming less active in the medium. If this is an accurate picture of the whole of the 2 to 24 year-old group, it has profound implications for the reported falling child literacy levels in England and Scotland, attributed by the National Foundation for Educational Research (NFER) to high internet usage levels by young children.
Of course, it is too soon to celebrate. It is not certain that the NFER is correct in blaming the internet for falling child literacy levels and, if it is, the fall in youngsters' internet usage needs to continue in order to return the UK to its previous literacy performance.
More information: https://www.netratings.com/
AT. Dec 07
Amazon has launched its Kindle e-book reader to mixed reviews. The most repeated criticisms are that it is ugly and expensive. Of course, beauty is in the eye of the beholder and one will never get complete agreement on this aspect. However, cost is another matter. At $399 in the US, Kindle costs approximately the same as a reasonable, low-end laptop computer. Unfortunately, it does not have the versatility of the computer. The problem of the high initial purchase price is further compounded by a fairly high cost for individual books ($9.99). Amazon is certainly not trying to follow George Eastman's proven marketing strategy, one used very successfully not only by Kodak but by the mobile phone companies: virtually give away the camera/phone and make lots of money on the subsequent consumable purchases.
Amazon is a marketing driven organisation. So, it is strange that the Kindle launch should be very low-key and even stranger that Amazon should say that it does not expect large volume sales. It looks as though Amazon is simply testing the market and that it will make its serious move when it knows more about the potential.
More views: https://www.publishingnews.co.uk/pn/pno-news-display.asp?K=e2007112212422839&TAG=&CID=&PGE=&sg9t=67ef20921fbab1c4a1b9ad91edce950e
AT. Nov 07
The Future of the Book
For some time, this webpage has been propagating the view that the ubiquitous printed book is due to morph into an equally popular electronic book. Unsurprisingly, this view appears to be more widespread in the USA than on this side of the Atlantic.
In the USA there is even a think tank dedicated to investigating the evolution of the book. This organisation, the Institute for the Future of the Book, has now opened up shop in London, based at the London School of Economics.
The change from ink on paper technology to electronic technology has huge implications for public libraries and the arrival of the Institute should help the expected transition to occur with a minimum of disruption. Of course, for this to happen, the Institute needs to spread its message far outside the academic world. However, the academic libraries should be able to pioneer the necessary changes more easily than the public ones.
More information: https://www.futureofthebook.org/
AT. Nov 07
The Public's Views on Technological Innovation
At the beginning of May 2006, the government (Department for Innovation, Universities & Skills) commissioned a project called the Sciencewise programme in order to explore the public's views on the wider implications of themes in science and technology that had emerged from previous strategic horizon scanning work. The project report has now been published.
The project's primary aims were:
to discover views about the issues raised by possible future directions for
science and technology, from a broad set of participants,
to inform policy and decision-making on the direction of research and the
regulation of science and technology,
and to help identify priorities for further public engagement on areas of
science and technology.
Its secondary aims were to:
widen public awareness of the role of science and technology in shaping the
future of the UK;
improve public confidence in the Government's approach to considering wider
implications of science and technology;
increase understanding of the value of public dialogue in shaping policy and
decision-making in science and other policy areas;
improve understanding of how to engage large numbers of people in
discussions and dialogue on science and technology-related issues, particularly
issues arising from new and emerging areas of science and technology;
strengthen coherence and collaboration among science engagement bodies.
As digital technology is beginning to have a significant effect on the operation of public libraries in the UK, this project is potentially helpful in predicting the reaction of the public to any proposed changes.
During the first half of 2007, a national series of public conversations was
held about new technologies and their impact on future societies. These consisted of:
a Deliberative Panel with a diverse group of some 30 members of
the public and invited expert speakers
Facilitated Public Events in science centres and other community venues
Self-managed Small Group Discussions run by community bodies
such as schools, Women's Institutes and faith groups.
The project's findings were:
There were no striking divisions of opinion by social class, gender, ethnicity or age. Broadly, people were likely to be positive, with important qualifications, about developments in science and technology that seemed to promise gains in choice, quality of life, longevity, convenience, time-saving and environmental impact. Potential impacts on social equity, freedom, privacy and human autonomy and skills were regarded with considerable suspicion or hostility. A minority felt largely anxious about the perceived domination of young people's lives by computer technologies, and the potential for a dehumanised future.
The approved aspects of technological developments are unsurprising. The perceived negative aspects appear to be associated with social engineering, political correctness & (alleged) public safety and are, presumably, reactions to policy makers' past mistakes. Technology can often be used, with equal ease, to benefit or to damage the public. It is the decisions of the policy makers that determine the balance. Too often, these people make their decisions with little understanding of the technology and its potential for good or bad. If the UK is to remain a leading economy, there is no doubt that it will have to enthusiastically embrace new technology. It would be extremely helpful if the UK could also improve its policy-making machinery.
AT Sep 2007
Mobile Phone Masts Proved Innocent
IEI-EMF is defined as idiopathic environmental intolerance with attribution to electromagnetic fields. It is a condition in which an individual experiences non-specific symptoms and attributes the cause of these symptoms to exposure to electromagnetic fields. This is the type of reaction that is usually blamed on mobile phone masts or WiFi antennas and includes tiredness, tension & anxiety.
Scientists at the University of Essex have investigated the effect of mobile phone signals on 44 people who complained of these symptoms and 114 people who had not. The research team, which included psychologists, engineers and a doctor, found that switching a mobile phone mast on and off made no difference to measurements of heart rate, blood pressure & skin conductance.
The group which had previously experienced the symptoms reported that they felt less well when they knew that the mast was active. However, when neither researchers nor subjects knew when the signal was present, there was no on/off difference in the number of symptoms reported. The sensitive group reported more symptoms, of greater severity, than the control group, but this was unaffected by whether the mast was on or off.
The researchers found that the sensitive group had a higher skin conductance. As skin conductance is a good measure of physiological response to environmental stress, they concluded that the symptoms experienced were real, even if the cause was not the mobile phone signal.
The university exposure system and testing environment was verified as accurate by the National Physical Laboratory.
Although the study did not use the WiFi frequency bands, it does tend to suggest that the fears expressed about the widespread use of WiFi are unfounded. It certainly provides no reason to halt the deployment of WiFi in public libraries. Of course, we are still left with the intriguing mystery of the true cause of the symptoms.
The study was reported in the journal Environmental Health Perspectives and
can be found at:
Subsequent to the above posting, the government's Mobile Telecommunications and Health Research (MTHR) Programme published its conclusions: the six-year research programme has found no association between short-term mobile phone use and brain cancer. Studies on volunteers also showed no evidence that brain function was affected by mobile phone signals or the signals used by the emergency services (TETRA), i.e. the Essex University research and MTHR research reached similar conclusions.
The French company Nemoptic has announced the availability of a 210x297mm e-paper display. This bistable, A4-size LCD display is less than 2mm thick. As the display is bistable, it only draws power from its supply when the image is changed.
At 200 dpi, the image resolution is similar to that of print on paper. The brightness of the image is greater than 30% (brightness is a relative measure of whiteness where black = 0% & white = 100%).
The display refresh time of less than 1 sec is only slightly longer than the time necessary to turn a page of a book, so the module can provide a practical basis for an electronic book reader.
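A quick conversion shows what an A4 panel at 200 dpi implies for pixel count:

```python
# A 210 x 297 mm panel at 200 dpi: convert millimetres to inches
# (25.4 mm per inch), then multiply by the dot pitch.
MM_PER_INCH = 25.4
dpi = 200

width_px = round(210 / MM_PER_INCH * dpi)
height_px = round(297 / MM_PER_INCH * dpi)

print(width_px, height_px)  # 1654 2339 - roughly 3.9 million pixels
```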
The technology developed by the French company has been licensed to Seiko Instruments for high volume production and appears to give a slightly higher resolution than the Philips technology being used by currently available e-books (see Electronic Books below). However, the main advantage is the greater display area. One possible disadvantage, when compared with newer display modules, is the lack of flexibility of the Nemoptic device (it has thin glass sheets within it, whereas the Philips display uses a flexible metal foil).
AT July 07
Laptop Computers Without Hard Drives
There are persistent rumours circulating that Apple may, later this year, introduce notebook computers without hard disks. The replacement of the hard-disk drives by flash memory will have the advantages of removing lengthy start-up times and reducing weight. There are also the possible benefits of improved reliability and longer battery operating times.
The laptop computers would use the same type of fast memory as music players and digital cameras. Their memory size has been increasing and the component cost has been falling rapidly although, for large memory sizes, hard drives still have a small price advantage.
AT. June 07
The CRT Display Lives on, in Disguise
In December 2006, Toshiba announced that it was on track to mass-produce SED TV sets by 2008. Production will be in cooperation with Canon, which has been working on the technology since 1986. Small-scale output will commence in 2007, but when the inevitable computer monitor derivatives will become available is not yet clear.
Each pixel of a SED (surface-conduction electron-emitter display) is an individual device which produces electrons to excite a surface phosphor coating, in a similar way to a CRT. Each device consists of a thin slit across which electrons tunnel when excited by moderate voltages (tens of volts). Some of these electrons are then accelerated toward the display surface by a high voltage gradient between the display panel and the surface-conduction electron emitter. The energy carried by the electrons is given up to the phosphor on impact to produce the pixel.
The claimed advantages for SEDs are:
Very high contrast ratios (50,000:1 to 100,000:1)
Exceptional 1ms response time
180 degree viewing angle
Brightness of 450 nits (candela per square metre)
Municipal WiFi Progress
Singapore is the latest world-class city (actually, it is a nation with a population of 4.6m people) to commit itself to providing a wide coverage WiFi service for its citizens. The estimated set-up cost of the system is S$100m (£33.2m), with S$30m of this being provided by the Singapore government. The remaining S$70m will be supplied by three telecom operators, SingTel, iCell & Qmax. Internet access will be available in most public areas at speeds of up to 512kbps. This is rather slow broadband but it will be free for at least two years from the start date of September 2007.
The stated aims of the new development are to broaden the opportunities for all segments of the population to access and benefit from technology, together with the determination to create digital opportunities for all Singaporeans and never allow a digital divide in our society. These are worthy goals that London should take heed of: self-congratulation over the success of the People's Network is now beginning to look a little self-indulgent, as the UK digital divide reopens in the WiFi access area.
To ensure that the poor of Singapore do actually benefit from the WiFi network, the government will offer 10,000 subsidised computers to low-income families with school-age children.
Of course, local authorities are not just concerned about the plight of the poor. They provide services for all and to do that they have to invest in a wide range of assets. Many of these assets are mobile, e.g. cars, vans, lorries & buses, and tracking them is a difficult problem. Nortel, a large equipment supplier to internet service providers, has announced what it considers to be the answer to the problem.
Nortel has integrated active RFID tags with GPS receivers. The tags communicate with a tag reader which then uses the municipal WiFi network to report the position of the asset. Such a system does not have to be limited to tracking assets which are inherently mobile. A normally fixed, high value asset which suddenly becomes mobile is probably being stolen and its whereabouts is of great interest to the police.
More information: https://www.ida.gov.sg/home/index.aspx
AT, Dec 06
Changing Cost of Broadband Services
A report issued by Point Topic at the beginning of December showed that the worldwide cost of broadband delivered by optical fibre has fallen below that of cable.
The report is based on research carried out by Point Topic up to the end of September and found that the rate of decline in cost of DSL- and cable-delivered broadband has slowed slightly, whereas the rate of decline for fibre-delivered broadband has accelerated. The result of these changes is that cable is now the most expensive and fibre the second most expensive. The cost of DSL services, carried over old-fashioned telephone lines, is still significantly below that of the competing methods, but fibre is rapidly closing the gap.
Currently, the purchasing power parity* monthly rental costs are:
Optical fibre $28.1
*purchasing power parity is a method of adjusting currency exchange rates in order to equalise the purchasing power of the currencies.
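The adjustment works by comparing what a fixed basket of goods costs in each currency; all figures in the sketch below are invented for illustration and are not taken from the Point Topic report.

```python
# Toy purchasing power parity calculation (all numbers invented):
# if a fixed basket of goods costs 100 USD in one country and 60 GBP
# in another, the PPP conversion rate is 0.60 GBP per USD, regardless
# of the market exchange rate.
basket_usd = 100.0
basket_gbp = 60.0
ppp_rate = basket_gbp / basket_usd  # GBP per USD

# A hypothetical 12 GBP monthly rental expressed in PPP dollars:
rental_gbp = 12.0
rental_ppp_usd = rental_gbp / ppp_rate
print(rental_ppp_usd)  # 20.0
```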
Coincidentally, or possibly not coincidentally, BT has announced that it has recently installed 2,300 km of optical fibre as part of its IP-based network transformation.
More information: https://www.point-topic.com/content/dslanalysis/061206benchmark.htm
The Financial Effects of Implementing RFID Technology
For some time, academic and public libraries in the developed world have been implementing RFID technology to manage their stocks (see RFID below). Last month, the public libraries of the city of Hamburg became the latest to start introducing it.
The advantages of RFID systems are obvious, but it has been rather difficult to obtain reliable, quantified information on how these advantages affect overall performance. The Dutch bookshop chain Boekhandels Groep Nederland (BGN) has now published the results of a trial that it has carried out in one of its shops.
BGN tagged the stock of its Almere outlet from April 06 and found that, in the following 6 months, its sales increased by 12%. This improvement is attributed to the ease with which customers were able to locate books.
The store has also cut back its inventory time for each box of books, from four minutes to a few seconds. Misplaced books are also found faster. If these cost reductions were replicated in all 42 BGN stores, handling 7 million books per year, it is estimated that $3.8 million per year would be saved.
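A rough sanity check: spread over the chain's annual book volume, the projected saving works out at roughly half a dollar per book handled.

```python
# BGN's projected chain-wide saving divided by annual book throughput.
annual_saving_usd = 3_800_000
books_per_year = 7_000_000

saving_per_book = annual_saving_usd / books_per_year
print(round(saving_per_book, 2))  # about 0.54 USD per book handled
```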
The increased turnover and reduced costs have resulted in plans to introduce the technology in a further 16 BGN stores.
AT, Nov 06
More information: https://www.rfidlowdown.com/libraries/index.html
Internet Explorer 7
Microsoft Corp. released Internet Explorer 7 on 18th Oct. This is the first major upgrade to its Web browser since 2001 and is designed to counter the inroads that Mozilla's Firefox product has been making in a market segment dominated by Microsoft (see Browser Wars & Mozilla Gaining Ground below).
Firefox features such as an integrated search window (allowing users to carry out a Web query without opening another page), tab browsing (allowing toggling between different sites) and a pop-up window blocker have now been incorporated in IE7. Unsurprisingly, IE7 uses Microsoft's Windows Live as the default search engine, whereas Firefox uses Google as the default.
One of the advantages claimed for IE7 is the ability to restore work after a browser/PC crash.
Microsoft now has a product which can compete very well with the current Firefox browser. However, Mozilla plans to release an upgraded browser, Firefox 2, within the next few weeks.
A recent survey by OneStat.com has found that Internet Explorer has an 86% global share and Mozilla Firefox 11.5%. Within the very important U.S. browser market, 80.77% of users surf the Web on IE and 14.88% on Firefox. Firefox is most popular in Germany (33.4%); Australia (25.5%); and Italy (21.6%).
IE 7 is available immediately
to Windows XP users and it will eventually serve as the default browser for
Microsoft's much-heralded Windows Vista operating system, due to be released
to consumers in early 2007.
IE7 can be downloaded from www.microsoft.com/ie.
Browser market survey is available from www.onestat.com
AT, Oct 06
European Digital Library
The European Commission has called on Member States to contribute to the European Digital Library by setting up large-scale digitisation facilities, accelerating the process of getting Europe's cultural heritage on line.
At present, only a small fraction of the cultural collections in the Member States is digitised but, by 2008, it is planned that two million books, films, photographs, manuscripts, and other cultural works will be accessible through the Digital Library. It is expected that this figure will grow to at least six million by 2010 and will ultimately be much higher, as potentially every library, archive and museum in Europe will be able to link its digital content to the new library.
Obviously, the input from the UK's public library services will be limited. However, the academic libraries have an enormous amount of suitable material and a good start has been made in digitising this. It is to be hoped that the Google / Microsoft initiatives in this area will be complementary rather than competitive (see: Open Content Alliance & Google Flexes Muscles).
The European Library portal is at: https://www.theeuropeanlibrary.org/portal/index.htm
Further information: https://europa.eu/rapid/pressReleasesAction.do?reference=IP/06/1124&format=HTML&aged=0&language=EN&guiLanguage=fr
AT Sep. 2006
The Bloomberg New York City administration is slowly becoming more receptive to the concept of providing WiFi access for the public. Unlike other large US cities (e.g. San Francisco, Pittsburgh and Sacramento) where free access is becoming the norm, NY City has been reluctant to do more than pontificate about the supremacy of private enterprise in the USA and it has been left to the students of Monroe College and local community groups to set up hot spots in the city.
Now, the city no longer expects to make money from any WiFi ventures in its jurisdiction and, by the end of the summer, ten of its parks will have WiFi access provided (eight locations within Central Park). In addition, squares in Manhattan, the Bronx and Queens will be covered.
A Senator representing New York State, Chuck Schumer, has stated that he will introduce a bill to provide US federal funds to communities wishing to install WiFi systems citywide or countywide.
Reuters reports that the mayor of Paris is planning to install 400 WiFi access points in the city's public areas such as parks and libraries. Some free access is expected when the system goes live next year.
Pity poor London.
The semiconductor arm of Philips, the multinational electronics group, has announced the launch of its next generation of RFID (radio frequency identification) chips. The company, the world's largest producer of RFID chips, is targeting the library market for its new product via collaborating hardware manufacturers and system integrators.
RFID technology has been around for some time but has only made a small impact on the library world (there are only a few hundred libraries using it, worldwide). Probably, the reason for this is the fairly high cost. However, costs are falling as the retail market increasingly adopts the technology. The retail market segment will need many millions of items per year and economies of scale will result. The standardization of library system technology that is both ISO 15693 and ISO 18000-3 compliant will also help RFID take-up in libraries.
North American libraries are leading the adoption of RFID technology, but the Netherlands has recently decided that its libraries would also utilise it. UK libraries seem to be taking a wait-and-see stance.
For hard-pressed UK local authorities, a reduction in the cost of book issuing is potentially the most attractive aspect of RFID technology. A borrower does not have to struggle with a bar code reader to take out/return a book via a self-service terminal. So these terminals become much more user-friendly and can be deployed on a wider scale. Indeed, the 2 metre range of the new devices suggests that dedicated terminals may not be necessary anyway. Grouped library catalogue computers could possibly be used, or even People's Network computers.
One of the most frustrating aspects of trying to use a public library as a reference source is the mismatch between the library catalogue and the actual stock on the library's shelves. Although staff do periodically check that books which have not been issued for a given duration are actually on the shelves, this is only a partial remedy and it is quite time consuming (costly). With RFID, proper stock-taking can be quickly accomplished and, in theory, could be undertaken each day to ensure efficient stock management, at almost zero cost. The remote searching of on-line catalogues would thus become far more useful.
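The reason an RFID stock-take is so cheap is worth spelling out: once every tag on the shelves can be read in bulk, reconciling shelves against catalogue reduces to a set comparison. The sketch below is not any vendor's API, and the tag IDs are invented.

```python
# Reconcile the catalogue against a bulk RFID shelf scan.
catalogue = {"tag001", "tag002", "tag003", "tag004"}  # what the catalogue claims
scanned = {"tag001", "tag003", "tag004", "tag999"}    # tag IDs read on the shelves

missing = catalogue - scanned     # catalogued but not on the shelves
unexpected = scanned - catalogue  # on the shelves but not catalogued

print(sorted(missing))     # ['tag002']
print(sorted(unexpected))  # ['tag999']
```

Because both steps are bulk operations, a daily run costs little more than the time to walk a reader past the shelves, which is why remote catalogue searches become far more trustworthy.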
AT, June 06
The alpha 3 release of Mozilla's Firefox 2 browser is expected to become available on 25th May. As an alpha release, it is still in the development phase and is still some distance away from a final release.
The current full release of Firefox (version 1.5) has been nibbling away at the market dominance of version 6 of Microsoft's Internet Explorer (see Mozilla Gaining Ground below) and Microsoft has been working hard to bring out an upgrade to its offering to counter this (IE7 is now at beta 2 release). Both IE7 and Firefox 2 are believed to have final release dates at the end of the year.
The more seamless operation with the Windows operating system that Microsoft can offer with its browser product is a huge advantage, so Mozilla has to offer other advantages to offset this. Microsoft's position has been further strengthened in this respect by Apple computers now running the Windows system.
AT May 06
Nicholas Negroponte, a professor at MIT, has been developing the concept of low cost computers for third world countries for some years and reported project progress at the recent LinuxWorld event.
The Negroponte-led One Laptop Per Child (OLPC) initiative has obtained $29 million in funding for engineering. It is expected that the project launch will be in 2007, with initial shipments of between 5 million & 10 million units in China, India, Thailand, Egypt, Nigeria, Brazil and Argentina.
The key to achieving the target $100 price for the laptop is the huge volumes needed, up to 100 million units per year. Initially, the computer will cost $135, although it is expected that this will fall to hit the $100 target by 2008. The cost may even drop as low as $50 in 2010.
Negroponte has pointed out that 50 percent of the cost of a normal notebook is attributed to sales, marketing and distribution costs. OLPC claims that it has no such costs. In addition, OLPC expects to avoid the 25 percent of total cost contributed by a Windows operating system license by using Linux instead. The remaining 25 percent of the total normal cost is that of the display. OLPC will reduce this by utilizing a dual-mode display that is both reflective and transmissive i.e. reducing the backlighting requirement.
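The cost breakdown quoted above is easy to check arithmetically. In the sketch below, the $200 base price for a "normal notebook" is an assumed figure for illustration; only the percentages come from the article:

```python
# Illustrative arithmetic only: the $200 base price is an assumed figure;
# the percentage breakdown is the one quoted by Negroponte.
base_price = 200.0

sales_marketing = 0.50 * base_price   # eliminated by OLPC's distribution model
windows_licence = 0.25 * base_price   # avoided by shipping Linux instead
display = 0.25 * base_price           # reduced via the dual-mode display

remaining = base_price - sales_marketing - windows_licence
print(remaining)  # 50.0 -- only the display cost is left, which the
                  # dual-mode design then attacks directly
```

On these figures, removing the first two categories alone takes the notional machine down to the display cost, which is consistent with the article's suggestion that the price could eventually approach $50.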
The $100 laptop computer will have a 500MHz AMD x86 processor, 128MB of DRAM & 512MB of Flash memory. It will have a power drain of less than 2 watts, which could be supplied by wind-up power, similar to the successful radio designed for the third world market. Wi-Fi connectivity will be provided and there will be three or four USB ports.
Although the $100 laptop does not appear to be very attractive for the more developed markets such as the UK, the enormous component volumes that will be required for its manufacture will have a great influence on the design of computers intended for those markets and will, inevitably, result in further price erosion there.
There has been much high volume comment from, and inspired by, large US telecoms companies about the supposed unsustainability of the business model used by schemes offering free wireless broadband internet access to city communities. Whilst these incumbent service providers heavily criticize the pioneering municipal WiFi/WiMax schemes, they are quietly beginning to follow the same strategy themselves. Sprint is now setting up two free wireless internet access zones in a suburb of Las Vegas to test the idea before rolling it out. Verizon and Time Warner are also companies that have had an unpublicised change of heart. Of course, this about-turn has nothing to do with the fact that large, non-telecoms companies (such as IBM & Google) are showing intense interest in the sector - or has it?
AT, Mar 06
Six bids for the San Francisco citywide WiFi system were received by the city on 21st February. As predicted, Google was one of the bidders (see Google Flexes Muscles). The search engine company has teamed up with EarthLink, the well-known ISP, to propose a two-level system. Google would offer a free, slow service (256 to 384 Kbps), whilst the EarthLink service would be fee based but faster (1Mbps in both directions). The companies would share the cost of deployment and operation.
The powerful SF Metro Connect alliance (IBM and Cisco Systems, together with SeaKay), has also submitted a bid for the San Francisco contract. It is believed that this also has free access provision. Other proposals have been submitted by Communication Bridge Global, MetroFi, NextWLAN and Razortooth Communications.
San Francisco is not the first US city to launch a citywide WiFi project. Philadelphia, Anaheim (both EarthLink projects) and some other major cities have also done so and it is expected that about 25 major US cities will have been unwired in the next two years. Although these are largish cities, they are generally far smaller than London (San Francisco has a population of less than 1 million people).
New York City is closer in size to London, with a population of just over 8 million people, and has an enthusiastic group of Councillors advocating the construction of a citywide WiFi system. However, the free-market Republican mayor, Michael Bloomberg, believes that a private sector group should set up a system, if one is needed. He has, therefore, been placing obstacles in the way of efforts to implement a community-run operation. Discussions have been going on for over three years and the only progress has been the introduction of a City Council Bill to set up a task force to investigate the matter.
Bloomberg's hope of a private sector WiFi solution for NY City has not produced any firm response from companies. EarthLink has stated that it was theoretically interested, but would prefer to have the city as a partner.
Not all influential people in the USA agree with the stance of the NY Mayor. Two Bills were recently introduced into the US Senate to allocate more of the available RF spectrum to community use.
Whilst the NY citywide WiFi scheme may only be crawling slowly forward, a London scheme has yet to even get to the public discussion stage. There is a great danger that the enormous opportunity presented by the 2012 Olympic Games for setting up a London-wide WiFi/WiMax system will be lost. The advertising possibilities open to any organisation, such as Google, willing to put such a system into London by 2012 would more than compensate for the large initial cost. Thus, London is in a good position to get a very good deal, but only if it moves in time.
America Online and Yahoo have announced that they intend to introduce a preferential email service. They intend to offer privileged treatment to companies which are prepared to pay for the delivery of email messages.
The internet companies allege that the preferential service will help them identify legitimate mail and cut down on junk mail. Presumably, all unpaid emails will be defined as junk mail. This perception is reinforced by the intention of AOL to allow preferential emails to bypass its spam filters and to delete links & images from most unpaid messages.
The new proposals seem to offer very little advantage to bulk email senders (the prime target) over the far cheaper, fixed fee services already provided by other companies.
Of course, one suspects that the new proposals are all about the huge additional revenue AOL and Yahoo will generate if they succeed in their ploy.
The principle of net neutrality states that network owners should remain neutral with respect to the content they carry. There has not been any outright dissent from this principle. However, AOL and Yahoo seem determined to ignore it. Additionally, executives at three of the largest US network operators (BellSouth, AT&T and Verizon Communications) have also suggested that large content providers should pay extra for priority use of their networks. The idea that service quality levels should only be guaranteed at a price has caused the US Consumers Union, the Consumer Federation of America and Free Press to call on Congress to enact net neutrality legislation. The US Senate has started taking evidence and a Google executive has argued forcefully against the suggested changes to current practice.
If the companies proposing the changes are successful, it is to be expected that other internet companies will then follow suit and the free services currently available will become progressively worse until they are finally discontinued. There is no doubt that a great blow will have been delivered to the vision of the open world wide web which was generally accepted such a short time ago.
This year will see the beginning of the end for Microsoft's Windows XP operating system. On the last day of 2006, Microsoft will stop supporting WinXP Home - it says. Support for WinXP Pro, the more expensive version, will continue until 2011.
There is some doubt about whether Microsoft will actually carry out its published programme, as it has failed to live up to past, similar declarations concerning Win98. After several stays of execution, Microsoft now intends to stop issuing new patches, updates and fixes for Win98 on 30/06/06 and to remove the self-help pages from its website one year later. It probably means what it says about support for Win98 this time.
After a life of eight or nine years, it is reasonable to retire Win98. However, the original retirement dates were obviously determined by Microsoft's income requirements rather than any great market need.
The retirement dates of WinXP are again being driven by cash generation considerations: the release of Windows Vista (formerly Longhorn) is expected to be announced in the autumn, with availability aimed at Christmas 2006. The major specification improvements promised for the new version are greatly improved security and, for gamers, 64-bit operation.
Provided that the applications running on an operating system are performing well, there is no great urgency to upgrade. Most current application software still supports Win98 and, of course, all of it that is aimed at the Windows system (rather than the Mac or Linux systems) supports WinXP. However, it is not absolutely necessary ever to upgrade Windows; one could simply change to Linux, at no cost. Free versions of Linux, such as Mandrake and CentOS (Community ENTerprise Operating System), can be downloaded. These are based on the Red Hat Linux source code - quite legally, as Red Hat is released under the GNU General Public Licence - and they are regularly upgraded, just like other operating system software. If the Linux operating system is coupled with OpenOffice, Mozilla's Firefox browser and Grisoft's AVG antivirus package, 95% of the software for a good home PC can be obtained free. There may be a very small cost for a business PC, but it would be insignificant compared with continuing to use Windows.
A few days after the above article was posted on the LLL website, Microsoft removed the 31 Dec 2006 date from the XP support timelines. The Microsoft website now states that mainstream support will end two years after the next version of Windows is released. This should now push support for WinXP out to the end of 2008 / beginning of 2009, as a minimum.
A report from the European Interactive Advertising Association was issued at the end of 2005 which compared the popularity of the different types of media on offer in Europe.
The report found that the European usage of online media grew by 17% in 2005 compared with TV at +6%, radio at +14%, newspapers at +13% and magazines at -7%.
France was found to lead the league table of internet usage with the average consumer spending 13 hours per week online. The UK and Spain were placed second in the table with the average consumer spending 11 hours per week online. Italians were the least enthusiastic internet users with an average of 8 hours per week online.
One surprising finding in the report was that already 10% of Europeans use VoIP (Voice Over Internet Protocol) for their telephone calls. This rapid take-up of the technology must be causing concern in BT and the other European wired & wireless telecom companies.
More information: https://www.eiaa.net
AT, Jan 06
Microsoft has joined the Open Content Alliance (OCA) and Yahoo to become a rival to Google (see Google Flexes Muscles) in the scramble to digitise the world's libraries. As a result of the collaboration, Microsoft will launch MSN Book Search next year. Other contributors to the OCA initiative include Adobe Systems, Harvard University, the Royal Botanical Gardens at Kew, the Smithsonian Institution, the UK National Archives and York University.
MSN Book Search will also deliver results from academic libraries, periodicals and other print sources, putting the service in direct competition with Google.
One of the first significant outcomes from the Microsoft initiative is an agreement with the British Library to digitise its out-of-copyright collection and to provide search facilities for it. During 2006, it is expected that 25 million pages (100,000 books) will become available on the internet from the British Library. This is only a small part of the British Library collection, but the project is expected to continue beyond 2006.
Of course, the Gutenberg project (see Books on Line) has been making out-of-copyright books available on the internet for many years, and public-spirited authors have increasingly been bypassing the publishing houses by simply uploading their books onto websites. The new developments bring to bear the enormous resources of Google & Microsoft and this should ensure that progress will accelerate very rapidly.
AT Dec 05
OpenOffice 2.0 Has Arrived - Free!
OpenOffice 2.0 final was released at the end of October. By default, it uses a vendor-independent format to save documents - utilising the open standard OASIS OpenDocument XML format. This file format is recommended by the European Commission.
As a direct competitor to Microsoft's Office suite, OpenOffice has the overwhelming advantage of being FREE to download. In common with Microsoft's offering, the OpenOffice package is large (75MB). Downloading via a broadband connection is therefore a practical proposition, but doing so via a dial-up connection would probably be a little difficult. Very cheap CDs of OpenOffice are often available and provide an alternative to downloading.
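The broadband/dial-up contrast can be roughly quantified. Assuming a nominal 1 Mbit/s broadband line and a 56 kbit/s modem (both assumed rates; real-world throughput would be somewhat lower):

```python
# Rough download-time estimate for the 75 MB OpenOffice package.
# Line rates are nominal assumptions; protocol overheads are ignored.
size_bits = 75 * 8 * 10**6      # 75 MB expressed in bits (decimal megabytes)
broadband_bps = 1_000_000       # 1 Mbit/s broadband line
dialup_bps = 56_000             # 56 kbit/s dial-up modem

print(size_bits / broadband_bps / 60)    # ~10 minutes on broadband
print(size_bits / dialup_bps / 3600)     # ~3 hours on dial-up
```

Roughly ten minutes against three hours, before allowing for dropped dial-up connections, which makes the cheap-CD alternative easy to understand.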
Microsoft has a near monopoly position in general office software used in library services and this is an unhealthy situation. Should not OpenOffice be seriously considered at the next upgrade?
Choosing Microsoft is the easy option; it does not require any thought. However, this used to be the situation with IBM computers, and IBM is now just one of the many companies in the market place. Why should the situation be different for software?
Google is stirring up hostility from the publishing world by extending its search technology to the world of books. Its new Google Print venture is intended to do for the print medium what it has done for the internet i.e. to point the user to the source of information in print. This idea was originally received by academic librarians with some enthusiasm (Oxford, Stanford, Harvard & Michigan universities have agreed to open their collections to Google), as the resources that Google can employ are enormous and this would obviously be, potentially, a great help in research tasks. The New York Public Library has also opened its collection to Google.
Books in English were the first Google Print target and it has already run into opposition from the (hidebound?) book establishment. The US Guild of Authors and three authors are suing Google for breach of copyright. Google believes that it has no case to answer, as what it is doing comes within the fair use definition, and it is pressing ahead with its programme by expanding beyond US / UK sources. Oxford University and the N.Y. Public Library have avoided the wrath of US authors by limiting Google's access to documents that are out of copyright.
It is difficult to see how Googles activities can be other than an advantage to the vast majority of genuine authors i.e. authors who have created original material. However, there are a small number of reference book compilers who may be able to show that scanning their works is a breach of copyright.
By embarking on a legal fight with Google over the interpretation of copyright law, authors could be opening Pandora's box. Copyright laws are very much more generous to authors than patent laws are to inventors and there is a school of thought that suggests there should be a relaxation of the copyright rules.
A high profile copyright legal fight may well open up the discussion. One of the leading critics of copyright law, Lawrence Lessig, a professor at the Stanford Law School, has suggested that one cannot accept the activities of the various internet search engines as being legal without accepting that similar scanning is also legal for print media.
Alternatively, the converse argument may be true. If the scanning of books is illegal for the purposes of providing source information, the traditional activities of internet search engines are probably also illegal. It looks as though the action by the US authors strikes at the very roots of the internet.
Having upset the US Guild of Authors, Google then went on to join up with Sun Microsystems to create a software business that could prove a great threat to the worldwide domination of Microsoft. Microsoft's Office suite market has been suggested as a likely target for the new group. It is some time since Microsoft has had a strong competitor, so it, too, is likely to be a little miffed.
Google has proposed setting up a free, citywide WiFi system in San Francisco. This will upset yet another set of business people. Google will use its usual advertising method of funding for the new venture. Could this be the way forward for London?
Londoners are to be offered an ADSL2+ internet access service in October. This type of service will provide broadband access speeds up to 18Mb/s+. The new offering will not come from BT, which has been testing a similar scheme, but from Be and UK Online. The cost of the new service is expected to be between £20 and £30 per month.
BT is concentrating on rolling out its 8Mb/s service across the country and did not expect a full UK roll-out of ADSL2+ for some time.
Another new service for Londoners, which started operation this month, is a broadband wireless one from UK Broadband. The speed range of this service is 256kb/s (cost £10/month) to 1Mb/s (cost £18/month).
The IFA tradeshow in Berlin during the first week of September was the showcase for two new developments which could prove to be significant milestones on the road to practical, mass-market electronic books.
The first development is an 8GB 1 inch hard drive which provides a storage density of 105 GB per square inch. This Hitachi device is also claimed to consume 40% less power than its predecessor and to be highly shock resistant.
The 1 inch Hitachi drive is 5 mm in height and 13 grams in weight. So, it is very suitable for electronic products intended to be carried in pockets. The low power requirement and good shock resistance are also considerable advantages for this type of application.
The next size up from the 1 inch drive is the 1.8 inch format. The increase in diameter (the height is still 5mm) has enabled Hitachi to make a 30GB drive available now and to promise a 60GB drive in early 2006.
These new drives would provide sufficient storage capacity for a small to medium library of books, providing that there are only a limited number of photographs present in those books.
The second development is a flexible display called the Readius. This has been produced by Polymer Vision, a spin-out from Philips in Eindhoven.
When not in use, the screen is rolled up and the overall size of the product shrinks to 100mm x 60mm x 20mm, approximately 1/3rd of the size of an average paperback book. However, when required for use, the device unwinds to provide a 4.8 inch (122mm) paper-like display.
The display is monochrome and uses the bistable E-Ink technology. The lack of colour is not really a great drawback for text, but the bistable nature of the display is a great advantage, as it would allow small batteries to be used. This is the same technology that is used in Sony's e-book reader, LIBRIe, which was launched approximately 18 months ago (see EBOOKS).
Mass availability of electronic book readers based on these and similar developments is still a few years away, but it is only a few years. When people have the capability of carrying around hundreds of books in their pockets, what will be the function of public libraries?
AT Sep 05
The first report, TCO for Application Servers: Comparing Linux with Windows and Solaris, states that the total cost of ownership (TCO) of a Linux application server over three years is $40,149, compared with $67,559 for Windows and $86,478 for Solaris. These totals include costs associated with transferability of administrator skill and hardware architecture portability, in addition to the immediate installation costs.
Other benefits are Linux's flexible licensing model and its compatibility with a broad range of hardware platforms.
The second report, Beyond TCO: The Unanticipated Second Stage Benefits Of Linux, considers the skill based benefits of using Linux. IBM suggests that Linux is very popular with IT staff and especially with young IT staff. The reason for this is the great support for Linux in universities and colleges. Thus, graduates leave university with a good knowledge of Linux and recruiting them tends to be easier for organisations employing the system. Their managers also find the flexibility of the open source model an advantage.
Further information: https://www-1.ibm.com/linux/competitive/solarisToLinux.shtml
LAPTOPS ARE FOR HOLIDAYS AS WELL
Tobacco, alcohol, drugs and now emailing have been found to be addictive. At least that appears to be the conclusion to be drawn from two recent US surveys.
An Intel sponsored survey carried out by Harris Interactive has found that 51% of respondents took their laptop computers on holiday with them. The vast majority of these people used their computers to send/receive personal emails and a rather sad 43% sent/received work related emails.
A second survey by America Online has found that 60% of US holidaymakers use their laptops to check their emails (13% for business purposes and 47% for personal reasons).
America Online also found that running games, films and music were popular holiday-time uses for laptops (56% of respondents).
Only 45% of the AOL respondents used their laptops to obtain travel information whilst travelling and 30% used them to edit holiday photographs.
The fact that a large proportion of US citizens refuse to be parted from their laptops because they cannot bear to be away from their email in-trays has interesting implications for those public libraries in central London that are considering setting up WiFi zones.
AT, July 05
Public library internet provision is sometimes criticized as being predominantly used for email. Whilst there is nothing wrong with this use, a recent survey suggests that the popular view may become outdated, if it is not already. The Pew Internet & American Life Project, 2005, has found that adult internet activities can be broken down as follows:
Research into products and services 78%
Read news items 72%
Look up DIY information 55%
Bid in online auctions 24%
Visit chat rooms 17%
Of course, this is an American survey and will not translate exactly to the UK. However, the UK usually follows where the US leads, so the survey can be used as an indication of the near future in this country. As the three most popular internet activities are surely also those which would be at the top of the list in any public reference library in the UK, the survey results should not be a great surprise. Perhaps there is something for the supporters of public library reference collections to worry about though.
The search for DIY information was found to go well beyond the expected home improvement and gardening themes. The advantage of the internet as a very broad source of information is clearly an important factor here.
More information: https://www.pewinternet.org/
AT, June 05
There are two developments slowly creeping up on internet users. They have both been around, in theory, for some time and are now becoming practical propositions. They each, of course, have the obligatory acronym without which it would be impossible for the internet community to take them seriously - IPv6 and ADSL2+.
Internet Protocol version 4, IPv4, has been in existence for a very, very long time. It provides the method of defining a unique address for the huge number of internet users that exist worldwide and now is being superseded by IPv6. The main reason for the change is that there are simply too many hosts on the internet and IPv6 provides an increase from 4 to 16 bytes (32 to 128 bits) for use in the address.
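The scale of that change is worth spelling out: going from 32-bit to 128-bit addresses does not merely quadruple the address space, it multiplies it by 2^96. A quick calculation:

```python
# Address-space sizes implied by the 32-bit -> 128-bit change.
ipv4_addresses = 2 ** 32
ipv6_addresses = 2 ** 128

print(ipv4_addresses)                    # 4294967296 (about 4.3 billion)
print(ipv6_addresses // ipv4_addresses)  # 2**96, roughly 7.9e28 times more
```

The IPv4 total is comfortably below the world's population once permanently-connected devices are counted, which is the exhaustion problem described above; the IPv6 total removes it for any foreseeable future.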
The migration of dial-up internet users to broadband internet use has resulted in these users changing from temporary holders of IP address numbers (automatically allocated by their internet service provider) to permanent holders of an IP address. The efficiency with which the available addresses are used has, therefore, declined at the same time as the overall use of the internet is increasing. Without the changeover to IPv6, this would be a recipe for disaster.
It may seem rather strange that the Internet Protocol version following version 4 should be version 6. Just as telephone system planners did not realise that they would eventually run out of telephone numbers, the internet planners were slow to understand that the same could happen on the internet. Thus, the IPv5 designation was allocated to a parallel development called the Internet Streaming Protocol before the potential problem was recognised.
With the introduction of IPv6, the internet user will not normally notice any difference in operation when he/she goes online. At least, that is the hope/expectation. However, there have been some problems reported. For instance, Microsoft's Internet Explorer has been found to be unable to resolve domain names (the www.xxxx name) to IPv6 addresses for some servers.
Whilst the objective of the IPv6 change is simply to keep the internet operational, the objective of ADSL2+ is to improve performance. Most people who have a broadband internet connection actually utilize ADSL (asymmetric digital subscriber line) technology, which allows a download rate of 1 or 2 Mbit/sec & an upload rate of a few hundred kbit/sec to be achieved, a significant improvement on the speeds available with a dial-up connection. That this transmission rate is fairly reliably accomplished on a basic POTS (plain old telephone system) copper network of the order of 100 years old (in London) is astounding. That there should be plans to accelerate the transmission rates by a factor of 10 to 20 seems unbelievable. However, the International Telecommunications Union (the international body overseeing telecommunications standards) approved the ADSL2+ standard in Jan 2003 to do just that.
The infrastructure upgrade necessary to meet the ADSL2+ standard will, no doubt, be something that BT will carefully control. However, ADSL2+ has been designed to be backward compatible with existing ADSL equipment, so there should be a seamless changeover to the new system, as far as the user is concerned. London, with its large population concentrated within 4 to 5 km of local exchanges and mostly within 2km, will probably be the first large scale deployment of ADSL2+ in the UK, although this is likely to be preceded by several small pilot trials. The 2km distance is important, as that is the present limit for achieving the maximum data rate transmission.
BT has only recently completed upgrading its telephone exchanges to make the standard ADSL service available to virtually the entire UK population. So why should it start immediately on another round of upgrades? The answer is that it makes good commercial sense. With the higher transmission rates of ADSL2+, it is possible to simultaneously carry high definition TV (HDTV) broadcasts, internet data and internet phone calls over that POTS network that BT owns. The internet data transmission business has, over the past few years, become an important profit earner for BT and it expects that HDTV & VoIP (voice over internet protocol, i.e. internet telephone calls) will become just as profitable.
So, when ADSL2+ has been fully rolled out, we will be able to look forward to a future of stable internet technology, where everyone will know what is on offer and everyone will be able to get it. Well no, not quite. There are already plans to increase transmission rates beyond those of ADSL2+. Look forward to the VDSL standard, where rates increase to 30 to 100 Mbit/sec.
AT, June 05
Sony, the inventor of the 3.5 inch floppy disc, has announced that it is to stop making this product. This workhorse version of removable storage has fallen victim to the improving reliability and decreasing cost of CDs.
The floppy disc cannot compete with the cost per MB of a CD and, with the high demand for image storage devices, it is unable to deliver the required storage capacity. It is useless as a system backup storage device. Nevertheless, it has given excellent service and is far more rugged than its predecessors, the 5.25 and 8 inch diskettes.
There are, of course, other manufacturers of 3.5 inch floppies. However, Sony has declared the death of the medium and it will not be long before it is buried. We all need to plan for that event by transferring must-keep data to CDs. The next PC purchased is unlikely to have a 3.5 inch floppy disc drive - some corner-cutting versions don't have them now.
In the recent past, the reliability of CDs has caused concern with data becoming irretrievable after only a few months of storage (see CD Problems & CD Problems Revisited, below). However, it is now clear that, provided the user is careful with his / her labelling (the adhesive on some labels and the ink in some felt-tip pens attacked the CD) and does not live in the Amazon rainforest (fungal growth was an occasional problem), CDs will provide a useful data storage life. The length of that life depends on the type of CD used.
If long storage life is the criterion for choosing a CD, the CD-R type is obviously preferred over the CD-RW type (why pay more for the ability to overwrite data?).
The cheapest CD-R discs are constructed by sandwiching a cyanine dye between a transparent polycarbonate main body and a reflective aluminium foil layer. The writing laser burns pits into the dye through the polycarbonate plastic to record the ones and zeros of the data. These pits can then be read by a low power laser which does not alter them. With this construction (recognisable by its bluish or blue/green tinge), it is claimed that a data storage life of 10 to 70 years is achievable.
The longest life for CD data storage is claimed for CDs which use a phthalocyanine dye with a gold reflective foil. The foil, seen through the transparent dye, gives the disc a golden colour and this also reflects the high cost. This type of CD has a claimed data storage life approaching 100 years.
Between the two extremes, there are two other types which give a better life than the cyanine / aluminium type but not as good as the phthalocyanine / gold type. These are the formazan dye / gold foil type (light green colour) and the metallised AZO / silver foil type (dark blue colour).
Remember to keep the CDs vertical and protected from dust.
AT May 05
One aspect of the London 2012 Olympics bid that has not been widely aired is the communications legacy that would be left behind. To properly accommodate the needs of the visiting athletes, hacks and public, it will almost certainly be necessary to upgrade the landline, cell phone and WiFi infrastructure throughout London, but especially in east & central London. Not to do so would put in jeopardy the operation of London's businesses.
The world-wide popularity of cell phones suggests that this is the area which will need the greatest work. However, by 2012, WiFi use will have expanded far beyond its present rather limited employment. Most new laptop computers have a WiFi capability and people are becoming more and more familiar with its use. Taking into consideration the fact that most of the visitors coming to expensive London will have to be fairly affluent, the demand for WiFi services can be expected to be great. Provision will have to be improved and there is a place for public libraries in the plan. Hopefully, the outrageously expensive British Library scheme (£4.50/hour for normal use and 50p/min for help) would not be considered to be the model.
AT May 05
The internet did not loom very large in the recent UK general election. However, analysis of the US Presidential election in 2004 now indicates that it could become another potent source of political comment. The BuzzMetrics and Pew Internet & American Life Project report "Buzz, Blogs and Beyond: The Internet and National Discourse in the Fall of 2004" compared political blog activity with that of other media in the last stages of the US Presidential election. Influence and buzz generation (buzz generation = the amount of simultaneous talk) were also compared.
The study considered 40 sites (16 conservative blogs, 16 liberal blogs and 8 general blogs). Conservative bloggers made 6,716 unique posts, the liberal bloggers wrote 7,151 entries; and neutral blogs had 4,251 unique posts between 27 September and 31 October, 2004.
The researchers allocated chatter on political message boards and forums to one of the conservative/liberal/general categories and found that there were 984,549 unique conservative posts, 947,503 liberal posts and 98,963 neutral posts in the period.
Campaign releases and official blogs generated by the two political parties were monitored and it was found that there were 955 Republican messages and 835 Democrat messages.
The US public does seem to have taken the opportunity to make its voice heard above the noise from the political parties and may well pose a future threat to the power of the media barons. If the US experience can be replicated in this country, democracy will be strengthened. The opportunity for the public to make its views heard directly, rather than via a media/government filter should act as a counter to apathy.
The danger is that the party political machines may try to hijack the new outlet and thus damage its credibility. The Labour party has been found to have used party activists to append their names to template and other biased letters to local newspapers. These supported party policy without acknowledging the party affiliation. It would be just as easy for the political parties to use this underhand technique on the internet. However, the technology exists to search simply and quickly for common phrases or other groups of words across a wide range of websites. The template nature of such internet messages would then become obvious, as would the original source. Not only would the practice quickly become public knowledge, but it would become self-defeating.
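A minimal sketch of the phrase-matching idea: comparing overlapping word sequences ("shingles") between two letters quickly exposes a shared template. The five-word window is an arbitrary choice for illustration:

```python
def shingles(text, n=5):
    """Set of overlapping n-word sequences found in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(a, b, n=5):
    """Jaccard similarity of two texts' shingle sets: values near 1.0
    suggest copies of a common template, values near 0.0 independent writing."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not (sa or sb):
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

Run over a batch of letters or web pages, any pair scoring highly would be worth a closer look.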
More information: https://www.pewinternet.org/
AT May 05
Google is rolling out a new feature for its widely used toolbar. It calls this addition Google Compute and it is the visible part of an initiative to aid the scientific community.
The initiative is to implement distributed computing in the widest possible way, a subject that has already been discussed here (see Grid Computing & Grid Computing Revisited). Google is effectively inviting users of its toolbar to participate in scientific computing projects simply by clicking an icon. It is offering two methods of participation, Standard and Conservative. The Standard mode allows Google Compute to run in the background whenever the computer is fully powered up, i.e. not in stand-by mode. The Conservative mode allows Google Compute to run only when the computer user is not actively using the computer. In addition, it is possible for the computer user to specify the amount of processor idle time which is allocated to Google Compute.
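The idea of running only on spare processor time can be sketched crudely by duty-cycling a task; real clients of this kind rely on operating-system idle priority rather than sleeping, so this is only an illustration of the idle-fraction control described above:

```python
import time

def run_throttled(task_step, idle_fraction=0.8, steps=5):
    """Run task_step repeatedly, sleeping between steps so that roughly
    idle_fraction of wall-clock time is left free for the rest of the
    system - a crude stand-in for a user-set processor idle-time limit."""
    results = []
    for _ in range(steps):
        start = time.monotonic()
        results.append(task_step())
        busy = time.monotonic() - start
        # Sleep so the busy period is (1 - idle_fraction) of each cycle.
        if 0 < idle_fraction < 1:
            time.sleep(busy * idle_fraction / (1 - idle_fraction))
    return results
```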
At present, the scientific project using the public's computers is the nonprofit Folding@home project at Stanford University. This is attempting to model the geometric structure of proteins, a fundamental property of life. Although Google is an American company, one would hope that it would not restrict access to Google Compute to US researchers. If the computers being used are sited internationally, then the benefits should be shared internationally.
Google has attempted to address privacy issues with its Google Compute offering. However, as with its other products, there is the concern that it is a US company, subject to the laws and law enforcement agencies of that country.
AT May 05
Wikipedia, which receives charitable funding from Google and Yahoo, is rapidly growing in popularity, according to the online intelligence service Hitwise. The data collected by Hitwise from ISPs and mega panels suggests that in March 2005 Wikipedia was the 33rd most popular site for visits from search engines. The site consistently improved its ranking during the second half of 2004 and has continued to do so in 2005.
Wikipedia is usually simply described as a free on-line encyclopaedia, but it is also popular as a source of news items. Although news items are more ephemeral than the traditional contents of an encyclopaedia, it is advantageous to have them grouped in one place. The background to the news items can then be looked-up very easily.
The open source Wikipedia is now proving to be more popular than the commercial general reference websites such as Answers.com and Encarta. Wikipedia achieved a traffic rate (market share) of 3.84% compared with 1.9% for Answers.com and 1.81% for Encarta. However, the most popular website in the overall reference category is Dictionary.com with a traffic rate of 4.46%.
AT May 05
IBM has designed a small unit which is intended to help people with hand tremors to use a computer mouse. It has granted a licence to a small UK company, Montrose Secam Ltd., to manufacture the unit. At less than £70, incl VAT, the unit is affordable by most libraries and represents a significant improvement in the service that can be offered to a segment of the community which is often ignored.
AT Apr 05
On 24th February a US report was released which suggested that 2005 might be the year that broadband power-line technology (BPL) becomes a realistic third option for computer users, adding to the existing high-speed connection options of cable or DSL. BPL operates in a similar way to DSL and cable: the signal is carried along existing power lines and emerges through any standard electric outlet into a modem that is plugged into the socket. Current BPL speeds are 500 kilobits to 1 megabit per second, although some of the world's major modem producers are developing technology allegedly capable of 100 megabits per second.
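Taking the quoted rates as line speeds in megabits per second (the usual unit for connection rates), the practical difference is easy to quantify:

```python
def transfer_seconds(size_megabytes, rate_megabits_per_s):
    """Idealised transfer time, ignoring protocol overhead and contention."""
    return size_megabytes * 8 / rate_megabits_per_s

# A 5 MB file at roughly 1 Mbit/s (today's BPL) versus the claimed 100 Mbit/s:
slow = transfer_seconds(5, 1)    # 40.0 seconds
fast = transfer_seconds(5, 100)  # 0.4 seconds
```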
Unsurprisingly, urban areas are expected to provide the main market for BPL.
The Washington-based New Millennium Research Council report concluded that consumer and regulatory issues are by no means insurmountable.
BPL is not being ignored in Europe. The telecom standard organisation, ETSI, carried out laboratory tests and the French electricity supply company, EDF, carried out field tests on medium and low voltage networks at a site just outside Paris in November 2004. The electricity supply companies are not yet completely convinced that providing communication systems for the public is a good commercial proposition. However, they can see great benefits in using the technology to manage their networks.
AT Mar 05
MORE on BPL
At the Intel Developer Forum in San Francisco on 24 August 2005, the HomePlug Powerline Alliance, the powerline communications standards group, announced that Intel Corporation, Linksys and Motorola, Inc. had become new sponsor members. Subsequently, Intel's Matt Theall was elected president of the Alliance.
The inclusion of these electronics industry heavyweights in the group suggests that powerline communication systems are now being taken very seriously, at least in the USA. There appears to be only one UK member of the group, Pace, the set-top box manufacturer.
AT Sep 05
A recent market survey by Janco Associates has shown that the Mozilla Foundation's Firefox browser has become the second most used browser in North America with 4.48% of the market. Of course, this is a long way behind Microsoft's Internet Explorer (84.85%). However, Mozilla can claim that every user of its product made a deliberate product choice, something that Microsoft cannot claim, due to its past bundling policy.
Although the Firefox market share is minuscule compared with that of Internet Explorer, it is still impressive: Firefox was only upgraded from the beta version in November. Even more encouraging for Mozilla is the fact that Firefox is the only widely used browser which is increasing its market share.
The rapid growth in market share of Firefox has been mainly driven by the blogger community where its market share is a very respectable 35%. This suggests that there is much more growth to come for Firefox, as knowledge of it leaks out into the general population.
The also-rans in the browser race are Netscape (3.03%), AOL (2.20%), MSN (0.58%) and Opera (0.34%).
Microsoft does not have a history of tolerating competition and another study, by Gartner, concludes that the software giant will respond to protect its market position. It certainly was not slow to do so in its war against Netscape.
Ironically, the destruction of Netscape by Microsoft is responsible for the creation of the Mozilla suite of Firefox (web browser) and Thunderbird (mail client). One of the last acts of Netscape as an independent company was to make the Netscape source code available to the open source community, and Mozilla has built on this.
Version 1 of the Thunderbird email client was released in December 2004 and is a direct competitor of Microsoft's Outlook Express. In the first month after its release, two million downloads of the software were made. The advantages of Thunderbird over Outlook Express are far faster operation and an inherently cross-platform product (the open source legacy).
UPDATE ON FIREFOX
A market survey published on 26 April showed that Firefox has continued to find consumer favour and has more than doubled its market share (10.28%) at the expense of Internet Explorer (83.07%), AOL (0.85%) and Netscape (0.92%).
More details: https://www.e-janco.com
The way that the internet scene is developing in the USA is of particular interest to the UK, as there is a tendency for the UK to follow the same path. A survey carried out by International Demographics found that the US growth in internet use has changed direction. Young people were the initial driving force but, in the last four years, internet growth has been dominated by people who are over 55 years old.
AT, Dec 04
Mobile phones are disruptive and annoying in a library environment and most public libraries are plastered with notices asking people to turn them off. These requests are comprehensively ignored and the nuisance goes on. Most library staff seem to have given up making an effort to enforce the rule, just as the police have opted out of enforcing the law on using cell-phones while driving.
So, the mobile phone has taken over and there is nothing that we can do about it. Well, not quite. QinetiQ, the defence research organisation, has perfected the manufacturing process for a wallpaper that will screen out mobile phone signals. This process prints a frequency-selective metal pattern onto either flexible or rigid substrates using far fewer production stages than would be necessary at present. It is therefore very cost effective.
The next time that a library needs to be refurbished or built, it would be a good idea if this type of wall surfacing was included in the specification.
Of course, there would have to be consideration of the radio-frequency leakage effect of windows. However, the new process can be used on rigid substrates, so it is probably extendable to glass. Even if it was not, the technology of incorporating metal patterns in glass is already well established for car window heaters and adhesive, transparent screening films are available.
Just think. All those notices asking people to turn off their mobile phones could be replaced by notices telling them that they are entering a mobile phone quiet area. There would probably be a queue to get in.
One side effect of screening a library from mobile phone signals is that any wireless local area network set up in that library could be made inaccessible from outside, usually considered to be a good thing.
At a Developer Forum in Frankfurt recently, Tim Johnson of UK research company Point Topic suggested that broadband internet use is growing at a faster rate than mobile telephone use ever managed to achieve. He said that the present estimate is for 150 million broadband lines to be installed by the end of 2004 and that South Korea is the country leading the change to broadband use with 24% of the population having this type of internet access.
The Ofcom April 2004 broadband connection figure for the UK is 3.99 million (2.45 million DSL connections + 1.54 million cable connections). Comparison with the South Korea figure is difficult, as it is not clear from the Ofcom data how many people, on average, use each broadband connection in the UK. However, Ofcom states that 15% of homes have broadband, so it is reasonable to suppose that the UK cannot be very far behind South Korea.
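A quick consistency check on the Ofcom figures, assuming one connection per home:

```python
dsl, cable = 2.45e6, 1.54e6   # Ofcom, April 2004
connections = dsl + cable     # 3.99 million connections in total
penetration = 0.15            # Ofcom: 15% of homes have broadband

# If each connection serves one home, the figures imply the UK housing stock:
implied_homes = connections / penetration   # roughly 26.6 million homes
```

That is close to the generally quoted size of the UK housing stock, so the two Ofcom figures hang together.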
BT has frequently been cast as the villain in the broadband story. It has been accused of deliberately delaying conversion of telephone exchanges in order to obtain a commercial advantage. However much truth there is (or was) in the accusation, it is becoming irrelevant, as 84% of UK homes and businesses now have the DSL service available and this will soon (summer 2005) be improved to 99.6% of all households. Although the cable availability map overlaps the DSL one, the actual overall broadband availability will be slightly higher. Even those homes a long way from an exchange may soon be able to benefit from a broadband connection, as BT begins 500 kbit/s ADSL trials over 10 km lines.
Next year, the telephone exchange conversion program will be virtually complete and it is obvious that the government deadline of all homes having broadband capability (if required) by 2008 will be met. It only remains for the potential users to be persuaded that high-speed internet access is worth the money. Once the infrastructure is in place and paid for, there will be an enormous pressure on the service providers to maximise the utilisation of that infrastructure. Thus, it is to be expected that the cost of a broadband contract will continue its downward slide and the services provided will be expanded further.
This view of the future is not restricted to the writer. During its recent, first Investment Analyst Conference, Ofcom stated: "By the end of the decade, the successful introduction of greater infrastructure competition combined with continued innovation in access technologies, could enable a majority of UK households to benefit from affordable and accessible broadband connections delivering video-quality bandwidth" [i.e. very wide bandwidth].
The question arises as to how this changing situation will affect London's library services. In the short term, there probably will not be much of an impact. There will be a continuing need to provide a computer literacy education program and to ensure that internet access is not exclusively for the affluent. Lending of books, CDs etc. will go on as before.
In the long run, when hardware and on-line access costs have reached a very low level, the public internet access facilities now provided by public libraries will be of small value to the community and their popularity will wane. Further, the computer literacy initiative will have worked its way through that part of the adult population reachable by such a program (children are, and will be, covered by school programs). At best, there will be a small residual requirement for on-line access for visitors to the capital and a minority of residents. The People's Network will have done its work and will probably need to be retired.
Long term developments in the range of services provided on-line and new hardware designs are likely to have an increasing impact on the core library activities that are, at present, not seen to be related to computer use. The first area to be affected will probably be the CD / DVD hire activity. With fast download, 24/7 availability and a vast catalogue to choose from, the on-line businesses will have major advantages over any high street operation, whether in the public or private domain. Added to this advantage is the inherently low overhead cost applicable to the on-line businesses.
It is just possible to see the beginning of this shift in market position with the pain now being experienced by traditional record shops. The more volatile part of the record industry, i.e. the pop record part, is suffering badly now, but the classical end is just as vulnerable in the long term. The record industry may well succeed in combating copyright infringement. However, this will do little to help the high street record shop the die is cast and public libraries will not be immune.
Once it is accepted that the contents of CDs can be obtained more easily and more cheaply via the internet than from a public library, it becomes clear that all associated electronic media will eventually also fall into this category. Thus the public library video tape and DVD collections, which have been so enthusiastically built up, will become wasteful space fillers from an income generation viewpoint. At some point, it will be necessary to decide whether to remove these items from the shelves or to remove the charges. If removal from the shelves is the preferred option, will books be substituted?
There are many electronics development projects still in the research phase which will accelerate the trend already identified and these will eventually come to bear on all core public library activities. In the very long term, even the book lending activity will not be immune from radical change.
Speaking at the recent Public Library Authority Conference, Lord McIntosh, the Libraries Minister, stated: "Books are here to stay because people want them and they will be a key element in the library service in the foreseeable future." It is true that books are very popular at present, but one should ask why this is before assuming that they will continue to be so.
After 500 years of printing development, books have become relatively cheap, convenient, easily obtainable and easily used containers of information (factual and fictional), i.e. they are the currently preferred delivery method for most detailed content. If an improved delivery method becomes available, it is inevitable that it will replace the currently preferred one. Research is underway in many parts of the world in preparation for this new product type.
There is little doubt that Library Services, over the next few decades, can be expected to change out of all recognition. Interesting times?
The concept of the portable electronic book originated in 1968 and has since been slowly developed. The speed of development has been dictated by the lack of suitable electronic components. However, this situation is changing and the rate of change is accelerating.
In particular, the range and quality of displays is improving fast. It is still not possible to find a commercial, portable display which can match the resolution available from good quality printing on good paper but, after less than half a century of development, it is possible to believe that this standard will be achieved in the short to medium term. This should be compared with the half a millennium required to reach this quality level by the print industry.
Late April will see the introduction of Sony's e-Book reader, LIBRIe. This utilizes the Philips / E Ink electronic ink technology to give a monochrome display with a resolution of approximately 170 pixels per inch (similar to newsprint). Although this resolution is not high, it has certainly proved acceptable to millions of newspaper readers throughout the world.
The real advantage of the LIBRIe display technology is the fact that it is non-volatile, i.e. it only draws electrical power when changing the display contents. This mitigates the second problem associated with the electronic book concept, the poor power density available from current batteries. High power density batteries are a prime requirement for a successful e-book machine, as the reader must be light-weight and easily handled. Although a considerable research effort is underway to solve this problem, commercialisation of the results has not yet materialised. The LIBRIe's power source is four cheap, widely available, standard AAA batteries, which is sufficient for reading 10,000 pages.
One inherent advantage of the e-book concept is the number of books that can be carried by one person, a fallout from the intense development effort that was aimed at improving computer storage technology. The LIBRIe can store 500 downloaded books. The range of books commercially available is continually increasing and, with programmes such as the Gutenberg Project (see below) dedicated to increasing the availability of digital books, there is no doubt that there will be sufficient content to justify the LIBRIe's storage capacity.
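To put the quoted figures in perspective, a back-of-envelope calculation; the 250 pages per book is an assumption for illustration, not a Sony figure:

```python
pages_per_battery_set = 10_000   # quoted figure for four AAA cells
books_stored = 500               # LIBRIe's quoted storage capacity
pages_per_book = 250             # assumption for illustration only

# Roughly how many complete books one set of batteries allows:
books_read_per_battery_set = pages_per_battery_set // pages_per_book   # 40
# Battery sets needed to read everything the device can hold:
battery_sets_for_whole_store = books_stored / books_read_per_battery_set  # 12.5
```

On these assumptions, a reader could get through about 40 books per set of AAA cells, which is remarkable for a portable display device.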
There have been several false starts in the e-book area and it is just possible that this new entrant will be another one. However, it is clear that each new entrant moves a step closer towards the ideal product and eventually there will be a product that is sufficiently close for the public to use it enthusiastically. If the experience of similar products is repeated, when that happens, the switch is likely to be very fast. This has profound implications for publishing, book retailing and libraries of all kinds. The mess that the recorded music business has got itself into by ignoring technical developments would pale into insignificance if the book world followed suit.
AT April 04
One of the problems with implementing a community WiFi system is the short range obtainable with the standard antennas available for the IEEE 802.11 a/b/g specifications. This is now being addressed by Intel and other companies as they carry out development of components designed to meet the provisions of the new standard IEEE 802.16a.
WiFi was originally intended to give communications only over a relatively small area such as offices or cafe hotspots. However, the potential for more universal use was quickly recognised and many community-wide schemes have been successfully set up. These have needed careful design in order to overcome the inherent limitations of the WiFi specifications.
The new WiMax standard is intended to provide greater range and ban