Google to finish UMich scanning in 5 years - what's that mean?
http://www.detnews.com/apps/pbcs.dll/article?AID=/20070413/BIZ04/704130354/1001/BIZ
I’ve started doing some actual research (and by that, I mean following a more formal methodology with the intent of writing something) on the University of Texas and the Google Project. I’ve been thinking about it and working a bit on it throughout the semester, with some helpful discussions with Georgia Harper and my committee, as well as other students, librarians, and library employees. One thing I’ve noticed is that news coverage tends to make a big deal of the speed at which Google digitizes materials, as in this article:
That’s amazing to Wilkin, who also leads the university’s own digitization project that began before the school partnered with Google. The in-house project scans about 5,000 volumes a year. At that pace, scanning the entire library would take 1,400 years.
And sure, that’s very interesting - but if the Michigan project is like the Texas project, it’s not quite as amazing. Or rather, it’s amazing on its face, but it shouldn’t really be compared to the library’s own digitization efforts. The digitization Google is doing and the digitization university libraries are doing in this situation are two different (and mutually exclusive) activities that serve different purposes. Comparing them isn’t entirely apt.
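Still, it’s worth seeing what the quoted figures actually imply about scale. A quick back-of-the-envelope sketch (the arithmetic is mine, not the article’s - the five-year timeline comes from the headline, the in-house rate and the 1,400-year figure from the quote):

```python
# Back-of-the-envelope check of the figures quoted above.
in_house_rate = 5_000                         # volumes per year, in-house project
in_house_years = 1_400                        # years to scan the library at that pace

collection = in_house_rate * in_house_years   # implied library size: 7,000,000 volumes

google_years = 5                              # Google's reported timeline
google_rate = collection / google_years       # ~1,400,000 volumes per year

print(f"Implied collection size: {collection:,} volumes")
print(f"Implied Google pace: {google_rate:,.0f} volumes/year, "
      f"about {google_rate / in_house_rate:.0f}x the in-house pace")
```

In other words, the numbers imply a collection of roughly seven million volumes and a Google scanning pace somewhere around 280 times the in-house rate - a dramatic gap, which is exactly why the comparison makes for good copy even if, as I argue above, it isn’t an apples-to-apples one.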
I’ll be looking at the broader question of what Google’s digitization means later.