OPENING THE GATES TO INFORMATION COMMONS
ShelfLife, No. 189 (January 13, 2005), ISSN 1538-4284
While respecting the right of corporations to charge for information, some information professionals are calling for fewer restrictions on its distribution and are lobbying for, or actively participating in, the creation of "information commons" -- a new way of producing and sharing information, creative works and democratic discussions. Like information portals, these "commons" (drawn from the historical existence of the English commons -- pieces of land to which members of a community had specific rights of access) are digital repositories of thematically related information. The information may include everything from scholarly journals to information on knitting. However, instead of being run by corporations, they tend to be run in a collective manner by like-minded individuals -- associations or university departments for instance -- and they are accessible to all. Proponent Marjorie Heins, a former American Civil Liberties Union lawyer and founder of Free Expression Policy Project, doesn't support free distribution of all information; her main concern is the "copyright mentality" that sees media giants attempting to squeeze the last dollar out of all content they control "rather than striking a more reasonable balance between fair return for effort and tying up information... The balance has gone awry." (Information Highways Nov-Dec 2004)
The magic that makes Google tick is an article worth reading if you are interested in learning how things work behind the scenes before and after you type your query into Google's search box.
Among the well-put things in the article, here is a quote that comes across as a bit sarcastic and arrogant:
The job is not helped by the nature of the Web. "In academia," said Hölzle, "the information retrieval field has been around for years, but that is for books in libraries. On the Web, content is not nicely written -- there are many different grades of quality."
Surely Google has made a lot of progress in applying IR knowledge to a very practical problem, but isn't it a show of arrogance to claim that academia has not helped (directly or indirectly) Google with its search technology?
December's issue of D-Lib Magazine brings an interesting article regarding the implications of RSS for science and research publishing. The Role of RSS in Science Publishing is worth reading. Yet another practical example of how blogs have brought forth a tool that can change the nature of the web as it is traditionally known. Websites are no longer static domains; RSS helps sites be distributed widely and, most importantly, act as a two-way communication channel.
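To make the distribution mechanism concrete, here is a minimal sketch of how a client (a feed aggregator, or a citation tool of the kind the D-Lib article envisions) might read article titles and links out of an RSS 2.0 feed. The feed XML below is invented for illustration; only the standard library is used.

```python
import xml.etree.ElementTree as ET

# A minimal, invented RSS 2.0 feed announcing new journal articles.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Journal: Latest Articles</title>
    <item>
      <title>Open Access and Citation Impact</title>
      <link>http://example.org/articles/1</link>
    </item>
    <item>
      <title>RSS in Scholarly Communication</title>
      <link>http://example.org/articles/2</link>
    </item>
  </channel>
</rss>"""

def read_items(feed_xml):
    """Return (title, link) pairs for every <item> in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in read_items(FEED):
    print(title, "->", link)
```

The point of the sketch is that the feed is structured data: any number of subscribers can poll it and react, which is exactly what turns a static website into a distribution channel.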
The following few paragraphs were prompted by a discussion with a colleague of mine about the philosophical links to/from information science.
Well, I think that any practical discipline or field of study is definitely informed by some philosophical discourse, even when the discipline itself does not acknowledge it, or does not seem to see it. In this sense, the field of Information Science(s)/Studies seems to lack an acknowledged philosophical grounding, even though there are some obvious links to philosophical discourse. Consider that many books and articles on information science do not emphasize the philosophical links (or if they do, they do so scantily, superficially and idiosyncratically), or simply start with practical issues, as if the phenomena treated by information science became part of the discourse just like that. Some of the phenomena treated by a discipline or field of study do emerge from practical problems; however, we should not neglect the phenomena that could arise from philosophical discourse. The philosophical link might not be an obvious one, or it might not seem a valuable enterprise worth researching, in which case what would be the point in pursuing such a link for scholarly work? And yet, there could well be very beneficial links.
Understanding the philosophical fundamentals/groundings that have informed and are informing information science/studies (implicitly or explicitly) might lead to a better understanding of the common elements that give rise (or are constitutive elements) to the phenomena treated by information science, and thus might provide us with a more coherent framework for treating such phenomena... to be continued...
This paper (Do Open Access Articles Have a Greater Research Impact?) reports its findings that "freely available articles do have a greater research impact. Shedding light on this category of open access reveals that scholars in diverse disciplines are both adopting open access practices and being rewarded for it."
The findings of this paper confirm what seems an obvious argument: the more accessible articles are, the more they will be used, and thus they ought to have a greater impact on research and practice.
An additional question that needs to be addressed in this context is the overall impact of articles published in open access journals. It is quite possible that articles published in open access journals might be able to shift the focus of a discipline or a field of study because of their wider availability and accessibility.
"Beagrie: In the right conditions papyrus or paper can survive by accident or through benign neglect for centuries or in the case of the Dead Sea Scrolls for thousands of years. It takes hundreds of years for languages and handwriting to evolve to the point where only a few specialists can read them.
In contrast, digital information will never survive and remain accessible by accident: it requires ongoing active management. The information and the ability to read it can be lost in a few years. Storage media such as paper tape, floppy disks, CD-ROM, DVD evolve and fall out of use rapidly. Digital storage media have relatively short archival life-spans compared to other media. As the volumes, heterogeneity, and complexity of digital information grows this requirement for active management becomes more challenging and more critical to a wider range of organisations."
I already have a problem reading/opening some papers/files that I wrote during my undergrad studies using WordStar (or something similar) in a school computer lab.
A brief overview of Mayomi (an online mind-mapping tool and community) reveals that ANT can be used to analyze and trace the connections between various elements/actors. As can be observed from the first page, the elements are human and non-human, some task oriented, others action oriented, as well as social and information structures, etc., making the tool a good fit for analysis within the ANT framework -- unless it was designed and developed based on the ANT framework and methodology in the first place. I hope to write more about this once I use the tool. A similar tool is FreeMind, which I have just installed.
Culture of secrecy hinders Africa's information society covers a few interesting ways mobile telephone technology is being used in Africa. It is evident in the article that the use of mobile technology is being redefined and continually socially constructed by the social and monetary resources available.
Among the other interesting paragraphs, this one is really revealing:
"The worst thing is that it is a short step from a culture of withholding information to that of becoming information-blind. In other words, when we keep on withholding information, we end up being unable to produce information. We lose the culture of surveying, assessing, classifying – in brief, collecting as much information as possible and storing it in a standardized manner, making it available for use, not only to cater for current specific needs, but also for potential and future ones."
Along the lines of this article's argument, it can also be explained why text messaging in the US is lagging behind Europe and Asia. Most cell/mobile phone service plans in the US come with a certain amount of 'free' minutes included. So, if you have free minutes, you use them before sending any text messages -- and also because the mobile devices on the US market are less text-messaging friendly. In contrast, in Europe you pay for each minute you talk, and you use text messaging because it is cheaper than talking; thus the social co-construction of the mobile telephony service, the technology, and its use.
The following article, Rules for a Complex Quantum World: An exciting new fundamental discipline of research combines information science and quantum mechanics, presents a fundamentally new way of looking at information science. As a framework in the making, it builds upon Shannon's information theory and Buckland's "information-as-thing", as well as quantum physics. It appears that this approach is closer to physics than contemporary information science studies, which deal with information primarily from the meaning-making viewpoint.
Could this pave the way for the groundwork towards the unified theory of information?
Google is great. Personally, I use it every day, and it is undeniably extremely good at finding stuff in the largely unstructured chaos that is the public Web. However, like most tools, Google cannot do everything. Faced with a focussed request to retrieve richly structured information such as that found in the databases of our Memory Institutions, hospitals, schools, colleges or universities, Google and others among the current generation of Internet search engines struggle. What little information they manage to retrieve from these repositories is buried among thousands or millions of hits from sources with widely varying degrees of accuracy, authority, relevance and appropriateness.
This is the problem area in which many organisations find themselves, and there is a growing recognition that the problems are bigger than any one organisation or sector, and that the best solutions will be collaborative and cross-cutting; that they will be common and shared. The Common Information Environment (CIE) is the umbrella under which a growing number of organisations are working towards a shared understanding and shared solutions.
While we have many blogging and other social software tools that enable the 'creation' of the collective, how do we harness the "collective intelligence" once it is 'there'/'built'? It would seem that other tools are needed to enable quick and relevant utilization of the collective intelligence. So far, the blogging tools have done a great job of enabling the representation of collective intelligence; they fall short as enablers for utilizing the available collective knowledge.
It seems that the next wave of social network and collaboration tools will/should concentrate more on finding relevant and appropriate 'intelligence' somewhere in the collective pool. Needless to say, search engines are not best suited for this type of activity, since they concentrate primarily on topical relevance and do little to nothing about spatial, temporal, methodological, contextual, process, and task-specific relevance.
"Shareability refers to the extent to which information is shareable. Information has high shareability if it is easy to share between different individuals without loss of fidelity. Shareability theory (Freyd 1983, 1990, 1993) proposes that internal (e.g. perceptual, emotional, imagistic) information often is qualitatively different from external (e.g. spoken, written) information, and that such internal information is often not particularly shareable. The theory further proposes that the communication process has predictable and systematic effects on the nature of the information representation such that sharing information over time causes knowledge to be re-organized into more consciously available, categorical, and discrete forms of representation, which are more shareable."
The distinction made above between internal and external information sounds almost exactly like the distinction made in the Knowledge Management (KM) discourse between tacit and explicit knowledge. Furthermore, the definition does not seem to make a distinction between information and knowledge, even though such a distinction appears to be very relevant in this context.
Another concern that might further help the above definition or the theory of shareability is to note that it is not the knowledge that is organizable, rather, it is mostly the representations of the explicit knowledge, and to a much lesser degree the representations of the tacit knowledge (if at all).
And one more thing: in the spirit of the concept of openness, for something to be shared it first must be open to change (open content), and access to it must also be open.
Bo-Christer Björk: Open access to scientific publications - an analysis of the barriers to change?:
"One of the effects of the Internet is that the dissemination of scientific publications in a few years has migrated to electronic formats. The basic business practices between libraries and publishers for selling and buying the content, however, have not changed much. In protest against the high subscription prices of mainstream publishers, scientists have started Open Access (OA) journals and e-print repositories, which distribute scientific information freely. Despite widespread agreement among academics that OA would be the optimal distribution mode for publicly financed research results, such channels still constitute only a marginal phenomenon in the global scholarly communication system. This paper discusses, in view of the experiences of the last ten years, the many barriers hindering a rapid proliferation of Open Access. The discussion is structured according to the main OA channels; peer-reviewed journals for primary publishing, subject-specific and institutional repositories for secondary parallel publishing. It also discusses the types of barriers, which can be classified as consisting of the legal framework, the information technology infrastructure, business models, indexing services and standards, the academic reward system, marketing, and critical mass."
When discussing the subject of digital libraries (DLs), often the very definition and meaning of the phrase "digital library" is questioned. This is expected due to the historical, practical and theoretical development of digital libraries as technologies (computer and information systems) as well as social structures.
Below I provide two definitions, by Borgman (1999) and Lesk (1997), that have been widely used by practitioners and researchers. Needless to say, both definitions embody the technical and the social nature of digital libraries.
Borgman (1999) attempts to explicate the meaning and interpretation of the phrase "digital library" through an analysis of various definitions of "digital libraries" coined by the various research and practice communities claiming to be somehow related to digital libraries, and to assess and identify possible influences of those definitions in the relevant communities. Borgman identifies two distinct senses in which "digital library" has been used (p. 227). The technological definition, stating that "digital libraries are a set of electronic resources and associated technical capabilities for creating, searching and using information" (p. 234), is contrasted with the social view, stating that "digital libraries are constructed, collected and organized, by (and for) a community of users, and their functional capabilities support the information needs and uses of that community" (p. 234).
Another workable and widely used definition is provided by Lesk (1997): "Digital libraries are organized collections of digital information. They combine the structuring and gathering of information, which libraries and archives have always done, with the digital representation that computers have made possible" (p. XIX).
Borgman, C. L. (1999). What are digital libraries? Competing visions. Information Processing & Management, 35 (3), 227-243.
Lesk, M. (1997). Practical digital libraries: Books, bytes and bucks. San Francisco, CA: Morgan Kaufmann.
In Prediction, Thijs van der Vossen states some ideas about how things will be in the future in terms of information and knowledge sharing.
While I agree that what Thijs writes is the desired outcome if we are moving towards a more open world, the outcome is not necessarily so. Yes, information needs to be free so it can be accessed from everywhere, by everyone, through many different devices and access methods. However, the assumption is that corporate entities will be willing to let go of the grip they have on any information that looks profitable.
So, one of the fundamental assumptions is that all sources of information and knowledge artifacts really want to share their content. In The open source Internet as a possible antidote to corporate media hegemony I have argued that the property of openness (open content and open communication), as a fundamental property of the Internet as we know it today, is perhaps the reason why Thijs's predictions look very probable. Hopefully no authoritative entity will put restrictions around what can be said and done online.
The idea that search engines (SEs) suppress controversy is indeed real. As argued in Do Web search engines suppress controversy?, the suppression is not intentional; rather, Google's bottom line means good results delivered quickly, not necessarily covering all sides of the story/issue about which an information seeker is trying to find information.
I've tried to explain this sort of mediating power/role of SEs in an earlier blog entry: search engines' meaning mediation power.
Google does it again. As with many practical implementations in the search world, Google is first again -- first in implementing it in the real world, not necessarily in research. As far as research is concerned, personalized searches have been discussed plenty.
This new personalized web search by Google utilizes facet-aided searches.
The entire search is dynamic. Once you set up the profile -- very simple and menu/directory driven -- the left side shows the built query. You can still type a search term. The FAQ shows a bit of how things are supposed to work.
In any case, the search is operational (beta), and once the relevant docs are returned, there is a small sliding bar that can be moved left-right in order to dynamically relax or restrict the personalization.
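Google has not published how the slider works, but its behavior is consistent with a weighted blend of a base relevance score and a profile-match score, with the bar moving the weight. The function, weights, and example results below are entirely my own illustration of that idea, not Google's algorithm:

```python
def personalized_rank(results, alpha):
    """Re-rank results by blending base relevance with a profile match.

    alpha=0.0 reproduces the unpersonalized ranking; alpha=1.0 ranks
    purely by profile match. The 'sliding bar' would move alpha.
    Each result is (name, base_score, profile_score), scores in [0, 1].
    """
    scored = [(name, (1 - alpha) * base + alpha * profile)
              for name, base, profile in results]
    # Sort by blended score, highest first, and return just the names.
    return [name for name, score in sorted(scored, key=lambda r: -r[1])]

# Invented results: one broadly relevant, one matching a knitting profile.
results = [
    ("general news story",    0.9, 0.1),
    ("hobbyist knitting FAQ", 0.6, 0.95),
]
print(personalized_rank(results, 0.0))  # base ranking
print(personalized_rank(results, 1.0))  # fully personalized ranking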
Interesting stuff! Just when you think you have learned how Google works! :)
Now all the other search engines will try to do the same. Why don't they start something before Google does, for a change?! What are they afraid of?
(thanks to unstruct.org for the link)
"HIGH-SPEED DIGITIZATION AND THE FUTURE OF LIBRARIES
A robotic scanner, custom built for Stanford University, is systematically digitizing parts of the university library's vast collection -- over eight million volumes. Resembling a giant copier, the 4DigitalBooks robot quickly and automatically scans about 1,000 pages per hour -- a complete 300-page book in 20 minutes. Stanford University Librarian Michael Keller, who oversees the project, says, "It's rigorously consistent -- the page is always flat, the image is always good, and software conversion allows you to index the text so you can search it." Rare books, however, are another matter. "We're very concerned about (them), so we haven't put any manuscripts on the robot. Instead, we use a technology based on the same cameras, (but turn) the pages by hand." In the next 10 to 20 years, Keller believes more and more information will be presented in digital form. "I suspect books will continue to be useful and important, and we'll (still) see them published. But people will find more and more of their information online, and the number of books will decrease." Stanford, for instance, is planning a science and engineering library whose goal is to have no books on the shelves. "We'll still need physical libraries," says Keller, "because people want to meet with one another. They want to work on projects collaboratively, and they also like to work in clusters and groups." (The Book & The Computer 15 Dec 2003) http://www.honco.net/os/index.html"
The following paragraph is an excerpt from my previous blog entry (Actor-Network Theory and Managing Knowledge), which I was prompted to repeat after reading James Robertson's Making meaning, which refers to Denham Grey's share meaning.
Next, I would like to demonstrate the naming and the power of the semantic tool with two examples reflecting on my personal experience upon embarking on the Ph.D. program. First, I would like to describe the performative power of the marks I inscribe on the pages of articles and books I read for my classes. Usually, at the start of a new article, more so if the article presents concepts that I perceive as unfamiliar, my red pen inscribes all sorts of marks (stars, checks, circles, question marks, exclamation marks, underlining, etc.) with their intended and perceived importance. The first run through the article produces a set of marks placed mostly in the margins of the article's pages, each of them with their perceived meaning of what I think is important and necessary for me to master the ideas presented therein, or because I believe that a particular quote will be useful later on. At times I wonder if I'm overdoing it with these marks, as the more I inscribe, the more they tend to lose their relative significance. The topology of the marks on the pages would have been much different (in relation to each other as well as in their quantity) had I had some prior understanding of the marks' meanings. Nevertheless, the point I'm stressing is that the marks tell me different things the next time I look at them, whether just to make sure I have understood a concept or for review purposes. Sometimes I even wonder why I have underlined a certain sentence. At other times I discover that I have missed a certain concept. However, the result is that these entities have performed on my knowledge structure and have also been performed upon themselves, as some of them do not carry the same meaning I attached to them at the time of inscription. The topology in this case would be the article with its marks and also my knowledge structure.
However, the article is also part of other topologies, such as the set of articles written by the same author, the set of articles contained in the journal in which it was published, the disciplines or studies it was intended to perform upon, etc. In my case the marks have performed mostly vertically (affecting my knowledge structure). They probably would not mean much to anyone else, unless I wrote a page of rules and guidelines describing the marks together with their intentions as I perceived them. But this would be a very hard task, because they might not be of much benefit to others given their personal performative nature. (Full article)
Interesting development regarding this software. In New 'NBOR' Software to Debut Next Month Yahoo reports:
"A few hundred thousand lines of computer code could revolutionize the way people interact with computers, say its unlikely inventor and his backers."
"... includes an intuitive user interface for writing, drawing, compiling multimedia presentations and other PC tasks. It allows real-time collaboration and sends large files over the Internet at lightning speed."
Nothing new so far. Sounds like just hype... BUT, you never know. One point of skepticism is the claim to 'intuitiveness'. Many have said this before... Also, it appears to be an all-in-one solution, and so far such applications have not been very successful in being adopted by the wider public. Let's wait and see...
I have come across pieces a few times before describing the potential of PowerPoint to dumb down people's way of thinking. The same is suggested in the following CNN article, Does PowerPoint make us stupid?.
That technology affects social structures and other social phenomena (personal and/or at the level of society) is widely acknowledged, and it is perhaps hard to argue otherwise. These effects are not necessarily negative or positive. The effects depend on the context, i.e. the contextual embeddedness of the technology within the social structures. (more about social constructionism vs. technological determinism)
Does PowerPoint (as a technology) have the same capability? Surely. However, to what extent is it able to affect an individual's way of thinking as far as the presentation of information and facts is concerned? Just because it affects an individual does not mean it has an effect on all faculties of reasoning and thinking of that individual. It can be argued that it does; however, one needs to be cautious not to jump to quick conclusions without proper research and a deeper understanding of the situation.
Like any other technology, PowerPoint tries to simplify 'things'. In the process of simplification the complexity of the context (including the content and the social structures) is usually 'relaxed' and many details are lost. I would argue that this is an unfortunate situation, because each simplification chips away little by little at a reality that is not necessarily simple. Thus, many complex phenomena are simplified to a great extent, up to the point where a new phenomenon is 'born' because the simplification has dumbed down the complex phenomenon into an unrecognizable one.
Whether the above reasoning holds for PowerPoint's ability to dumb us down is another matter. If it does, I would suggest that the context is also responsible, and not PowerPoint alone. Nevertheless, it is important to understand the scope of how widely this is even practically possible. Theoretically one can argue that an element in a socio-technological context can have extensive effects in terms of time and space/distance. Practically, it should be analyzed whether the element, PowerPoint in this case, is first and foremost able to affect the individual beyond presentation-mode thinking, or whether it also affects the individual's other intellectual faculties. (more on Actor-Network Theory and Managing Knowledge)
Maybe it is the immediate relevant context (organization, corporation, society, etc.) that has been dumbed down enough that simple presentation tools like PowerPoint suffice? Or is it PowerPoint? I would think it is both to some extent: complex thoughts, ideas, and solutions are hard to present in their complexity in the 30-45 minute timeframe usually allocated for presentations to managers. That is why we have journal articles and research papers.
In Too much information, Nathan Cochrane makes a good point that despite the multitude of tools at our disposal for managing and manipulating information, we have not necessarily become more informed decision makers. Perhaps the events and issues about which we need to make informed decisions have become so complex that the current tools (based on utilitarian theoretical foundations) do not help us much.
Who Owns The Facts?
(courtesy of slashdot)
"windowpain writes "With all of the furor over the Patriot Act a truly scary bill that expands the rights of corporations at the expense of individuals was quietly introduced into congress in October. In Feist v. Rural Tel. Serv. Co. the Supreme Court ruled that a mere collection of facts can't be copyrighted. But H.R. 3261, the Database and Collections of Information Misappropriation Act neatly sidesteps the copyright question and allows treble damages to be levied against anyone who uses information that's in a database that a corporation asserts it owns. This is an issue that crosses the political spectrum. Left-leaning organizations like the American Library Association oppose the bill and so do arch-conservatives like Phyllis Schlafly, who wrote an impassioned column exposing the bill for what it is the week after it was introduced."
"Problem was, it took about two years for the article to wind its way to publication. And by that time, many of the sites they had cited had moved to other locations on the Internet or disappeared altogether, rendering useless all those Web addresses -- also known as uniform resource locators (URLs) -- they had provided in their footnotes."
I think the problem was perhaps the reliance on those URLs, knowing that they cannot be relied upon for an extended period of time.
"Dellavalle's concerns reflect those of a growing number of scientists and scholars who are nervous about their increasing reliance on a medium that is proving far more ephemeral than archival."
It isn't the medium; it is the not-so-rigorous self-discipline of individuals who put serious and valuable scholarly material on websites that are not maintained.
This is why 'permalinks' used by bloggers are so great.
Further, I would like to stress that the problem is not with the internet as a publishing medium, but rather with the publishing strategies followed by self-publishers. Self-archiving could effectively resolve this problem. Or, even better, a publishing framework like MIT's and HP's DSpace, which cares tremendously about preservation and the permanency of locations. Another such digital library system that can be used to preserve publications and URLs is Greenstone.
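What makes a permalink durable is that the URL is derived from stable properties of the item (its date and title) rather than from the site's current directory layout, so reorganizing the site need not break old citations. A minimal sketch of such a scheme (the slug rules and base URL are my own illustration, not any particular blog engine's):

```python
import re
from datetime import date

def permalink(base_url, posted, title):
    """Build a date-and-title permalink that survives site reorganizations."""
    # Lowercase the title and collapse every non-alphanumeric run to a hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    return f"{base_url}/{posted.year}/{posted.month:02d}/{slug}"

print(permalink("http://example.org/blog",
                date(2005, 1, 13),
                "Opening the Gates to Information Commons"))
# http://example.org/blog/2005/01/opening-the-gates-to-information-commons
```

Because nothing in the address depends on folder structure or a database ID, the same URL can keep resolving even after the underlying system changes, which is the property the citing scholars above were missing.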
"There is overall agreement that various historical transformations are taking place and that in their own way these steps are creating their own dynamics in the process of evolution of the social framework. There is consensus that this inter-action is leading to divergence as well as convergence amid all the changes. The principle of obtaining shared knowledge and the learning process itself, while augmenting mutual growth and understanding, are also creating distinctions. This is partially because global society is not always promoting equitable chances and opportunities for human welfare and there is absence of orderly interaction and sustained cooperation to reduce uncertainties and inconveniences at the global level."
"Knowledge based education has now become central to the creation of the intellectual capacity on which knowledge production and utilisation depend. We have to promote lifelong-learning practices and update knowledge and skills if we are to retain competitive advantage. Traditional institutions have an important role to play in this regard. They have to take advantage of the opportunities offered by the new information and communications technologies. Failure to do so will mean the widening of the digital divide that is facing most of the developing countries, particularly the low-income countries."
I've been puzzled for some time as to what is meant by the phrase "semantic web" and what it means in practice and research. Today I came across the following article, The Semantic Web, which appears to describe the semantic web concept(s) in a clear and presentable way.
The article makes the following distinction:
"The key point of the semantic web is the conversion of the current structure of the web as a data storage (interpretable only by human beings, that are able to put the data into context) into a structure of information storage."
I can understand the above intention and the attempt to make a distinction between data and information. However, the distinction between data and information that we make in our heads does not mean much to computer software.
Further, the article states:
"The Semantic Web is based on two fundamental concepts: 1) The description of the meaning of the content in the Web, and 2) The automatic manipulation of these meanings."
As far as 1) is concerned, the description itself is just more data (or information), i.e. metadata (or metainformation). In any case, the proper software tools have to be built to 'understand' the metadata/metainformation.
As far as 2) is concerned, the manipulation of meanings, I am a bit skeptical because to a machine, as I've tried to explain elsewhere here and here, those descriptions are just data it can manipulate, not meanings.
This is not to say that metadata and metainformation cannot raise the level of quality in the process of information seeking and access to information. I'm just a bit skeptical about the hype and the high level of optimism that the semantic web will deliver us from the chaos of the web.
An interesting parallel is natural language. Each language is composed of words and phrases with certain meaning(s) and/or concepts attached to them. To be able to navigate the conceptual space of a language (i.e. understand it), one needs to learn what each word represents, because each word or phrase is, in effect, metadata/metainformation for the actual concept in that language. So, it is good to be optimistic that we'll eventually manage to represent the vast and chaotic multitude of information on the web with a set of metadata/metainformation and ontologies that all software will 'understand'.
Well... Esperanto hasn't yet become the world language it was meant to be... And it does not seem that it will become one anytime soon... And even if it does, there will still be multiple meanings for various phrases...
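The point about words being metadata for concepts can be made concrete with a toy sketch. Everything below is invented for illustration (real semantic web ontologies use RDF/OWL and are far richer), but the structural point is the same: to software, a "meaning" is just another lookup in a table that humans had to fill in.

```python
# Hypothetical word-to-concept table. To the program, the concept
# strings are just more data; the "meaning" lives with the humans
# who wrote the table, not in the machine.
lexicon = {
    "bank": ["financial institution", "edge of a river"],
    "syntax": ["natural-language grammar", "programming-language grammar"],
}

def concepts_for(word):
    """Return all candidate concepts for a word. The software cannot
    choose among them without context supplied from outside."""
    return lexicon.get(word.lower(), [])

# Both senses come back; disambiguation still requires human context.
print(concepts_for("bank"))
```

Note that nothing here 'understands' anything: the ambiguity of "bank" survives intact, which is exactly the skepticism expressed above about manipulating "meanings."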
HOW MUCH INFORMATION 2003? is an interesting study trying to estimate the amount of new information (i.e. information-as-thing) created each year. It is important to note the emphasis on information-as-thing in this study.
From the page:
"This study is an attempt to estimate how much new information is created each year. Newly created information is distributed in four storage media – print, film, magnetic, and optical – and seen or heard in four information flows – telephone, radio and TV, and the Internet. This study of information storage and flows analyzes the year 2002 in order to estimate the annual size of the stock of new information contained in storage media, and heard or seen each year in information flows. Where reliable data was available we have compared the 2002 findings to those of our 2000 study (which used 1999 data) in order to identify trends – recognizing that 1999-2002 were years of relatively low economic activity. The 2000 study is located on the Web at http://www.sims.berkeley.edu/how-much-info/. Note that this – the 2003 study – has revised certain of the 1999 estimates when we have found new and better data sources."
I just came across the Directory of Open Access Journals and was amazed at the number of open access peer-reviewed Library and Information Science journals. The "Directory of Open Access Journals ... covers free, full text, quality controlled scientific and scholarly journals ... [with the] aim to cover all subjects and languages."
Open Source Everywhere by Wired's Thomas Goetz.
A must-read article elaborating on and explaining various aspects of the open source philosophy, most widely apparent and spread in software development.
"We are at a convergent moment, when a philosophy, a strategy, and a technology have aligned to unleash great innovation. Open source is powerful because it's an alternative to the status quo, another way to produce things or solve problems. And in many cases, it's a better way. Better because current methods are not fast enough, not ambitious enough, or don't take advantage of our collective creative potential."
Check these open source efforts mentioned in the article:
- OPEN SOURCE FILM
- OPEN SOURCE RECIPES
- OPEN SOURCE Π
- OPEN SOURCE PROPAGANDA
- OPEN SOURCE CRIME SOLVING
- OPEN SOURCE CURRICULUM
"Software is just the beginning … open source is doing for mass innovation what the assembly line did for mass production. Get ready for the era when collaboration replaces the corporation."
"But software is just the beginning. Open source has spread to other disciplines, from the hard sciences to the liberal arts. Biologists have embraced open source methods in genomics and informatics, building massive databases to genetically sequence E. coli, yeast, and other workhorses of lab research. NASA has adopted open source principles as part of its Mars mission, calling on volunteer "clickworkers" to identify millions of craters and help draw a map of the Red Planet. There is open source publishing: With Bruce Perens, who helped define open source software in the '90s, Prentice Hall is publishing a series of computer books open to any use, modification, or redistribution, with readers' improvements considered for succeeding editions. There are library efforts like Project Gutenberg, which has already digitized more than 6,000 books, with hundreds of volunteers typing in, page by page, classics from Shakespeare to Stendhal; at the same time, a related project, Distributed Proofreading, deploys legions of copy editors to make sure the Gutenberg texts are correct. There are open source projects in law and religion. There's even an open source cookbook."
"Of course, for all its novelty, open source isn't new. Dust off your Isaac Newton and you'll recognize the same ideals of sharing scientific methods and results in the late 1600s (dig deeper and you can follow the vein all the way back to Ptolemy, circa AD 150). Or roll up your sleeves and see the same ethic in Amish barn raising, a tradition that dates to the early 18th century. Or read its roots, as many have, in the creation of the Oxford English Dictionary, the 19th-century project where a network of far-flung etymologists built the world's greatest dictionary by mail. Or trace its outline in the Human Genome Project, the distributed gene-mapping effort that began just a year before Torvalds planted the seeds of his OS."
"The entire ideology of information technology for the last 50 years has been that more information is better, that mass producing information is better," he [Jakob Nielsen] says.
For a company in the business of managing and manipulating information, certainly more information is better. However, that says little about the quality of life, and little about the quality of the information itself.
"The fix for information pollution is not complex, but is about taking back control your computer has over you."
This is a very profound philosophical statement; certainly not everyone believes there is control we have to take back from the computers. Just how do we go about taking back that control anyway? I'm not saying this is impossible, it is just not easy, due to many factors, one of them being that not everyone believes there is control to be taken back. As with any potential problem, one of the most important steps toward a solution is diagnosing the problem properly. In the case of information pollution, diagnosing the root of the problem in context might turn out to be the hardest task.
ESCHEWING MOONBEAMS, BERNERS-LEE STICKS TO HIS KNITTING
(ShelfLife, No. 127 (October 9 2003))
Quote: "Asked by a BBC interviewer whether it's a "stupid fear" to worry that the Internet will become a giant brain, World Wide Web creator Tim Berners-Lee replied: "Computers will become so powerful and there will be so many of them with so much storage that they will in fact be more powerful or as powerful as a brain and will be able to write a program which is a big brain. And I think philosophically you can argue about it and spiritually you can argue about it, and I think in fact that may be true that you can make something as powerful as the brain, really whether you can make the algorithms to make it work like a brain is something else. But that is a long way off and in fact that's not very meaningful for now at all. All I'm looking for now is just interoperability for data." (BBC News 25 Sep 2003)
"The body of scientific and technical data and information (STI)* in the public domain is massive and has contributed broadly to the economic, social, and intellectual vibrancy of our nation. The “public domain” may be defined in legal terms as sources and types of data and information whose uses are not restricted by statutory intellectual property laws or by other legal regimes, and that are accordingly available to the public for use without authorization. In recent years, however, there have been growing legal, economic, and technological pressures that restrict the creation and availability of public-domain information—scientific and otherwise. It is therefore important to review the role, value, and limits on public domain STI."
"In a white paper Neil McLean and Clifford Lynch try to give an overview of the many problems that arise when the educational world meets the library world, just after both have met the ICT world. Basically, they say that both worlds don't know each other. Sadly, this paper will not change that."
Nice observations and critique.
(courtesy of Information Literacy Weblog)
"There is a short article in the October 2003 Library and information update A new kind of worker. It is written by three people from the UK information consultancy TFPL. It highlights some of the benefits and challenges of embedding information literacy in the workplace, and uses TFPL's "Find; Organise; Create; Use; Share; and Value" approach to comment on current developments. It also mentions this weblog ;-)"
Reference: Winterman, V., Skelton, V. and Abell, A. (2003) "A new kind of worker." Library and information update, 2 (10), 38-39. http://www.cilip.org.uk/update/issues/oct03/article4oct.html
By far one of the best-argued positions explaining the paradoxes and stupidities of SCO's claim that they 'own' Linux.
An open-source letter by Joe Firmage, a former vice president of strategy for Novell's Network Systems Group:
"OK, Sontag, fine. If you cannot inadvertently or accidentally assign your copyright, then there should be no problem in identifying exactly which portions of Linux allegedly violate SCO's rights. Simply issue a statement that identifies the offending code, stating clearly that the identification does not represent a release of rights into open source."
"The model of open science is "communistic" in the sense of community ownership--or rather community stewardship. But innumerable highly successful organizations and institutions in America are founded upon the ideal of community stewardship--including our democracy itself.
The downfall of communism was due to state control by totalitarians--an attribute embodied by today's commercial software industry far more than by the emergent open-source science of information technology."
The Massachusetts Institute of Technology is making its course materials available to the world for free download.
"One year after the launch of its pilot program, MIT on Monday night quietly published everything from class syllabuses to lecture videos for 500 courses through its OpenCourseWare initiative, an ambitious project it hopes will spark a Web-based revolution in the way universities share information."
Let's see how far (in time and space) this 'revolution' will reach! Maybe, if each school does not have to (re)create course materials from scratch, tuition will go down! :) Or maybe someone will just make more money.
Nevertheless, in terms of information and/or knowledge sharing, there ought not to be any doubt that this is a step in the right direction. Hopefully, its potential can be utilized to benefit society in general.
"Sharp divisions over how to bridge the digital divide between rich and poor have emerged ahead of a UN summit on the issue in December."
No wonder... with the presence of representatives from the private sector, who ultimately care about their bottom line (i.e. $$$), it can hardly be expected that much will be achieved in terms of equality of information access. This sort of exercise leads nowhere unless there is a long stick the ITU can use to implement the promoted initiatives and even modestly tilt the balance of access to information.
"African nations have been rallying behind a proposal from Senegal to set up a new 'digital solidarity fund'"
"Many industrialised nations are wary of creating a new UN fund. Instead they favour encouraging investment by private companies and re-directing existing aid."
It appears that the issue of control and profits is the sticking point. So the question does not seem to be whether the developing countries should be 'helped' with advanced information technology. See my entry the seriousness of equal access to information for all - Information Summit, where I've tried to present my concerns.
Academia Urged To Offer Library Services To Graduates in ShelfLife, No. 125 (September 25 2003):
"Today's college and university students graduate expecting, even demanding, to have continued access to the kinds of information-rich facilities they grew accustomed to and relied on during their student days. So says Clifford Lynch, executive director of the Coalition for Networked Information (CNI), who argues that more must be done to accommodate these expectations. Lynch notes that the transition from an information service within higher education to one broadly available to the public is not always simple or quick. For example, there was a gap of some years between when college and university graduates first started creating demand for the Internet and when the commercial market place was prepared to service this demand, particularly at reasonable prices. Currently the demand for information services focuses on content rather than computation and communication, creating a market for the licensed, proprietary digital content that schools do not own but pay licensing fees for under contract with the publishers and other service providers who hold the rights to the content. Because many suppliers are not set up to license to individuals or want to charge absurd prices, libraries, both public and academic, represent a potential resource to serve both their graduates and the public at large. Lynch suggests that higher education institutions and their faculty have an obligation to put on their agenda the issue of making their information services available beyond their academies' walls. (Educause Review Sep/Oct 2003) http://www.educause.edu/ir/library/pdf/erm0356.pdf"
"All of us have suffered the consequences of poor-quality information. For most of us, most of the time, the impact has minor significance and is of short duration. Perhaps we missed a bus or flight connection as a result of using an out-of-date timetable, or we lost an insurance claim because we failed to note the change in exemptions to the policy when we last renewed. As frustrating or painful as these examples may be, they are rarely fatal. However, in a small percentage of cases, poor quality information has direct, devastating consequences. For example, many of the arguments concerning personal privacy are based on the knowledge that an adverse comment on a person's reputation perpetuates itself, even after a formal retraction is published or a libel case is won. Some sorts of information are more "sticky" than others. Just as the garden weeds are more robust than the desired plants, bad information rears its ugly head more virulently than good information."
As I was attempting to identify a few queries (for the class in which I assist the professor as a TA) that would return URLs whose relevance differs depending on the user (needs, interests, etc.), I searched Google for the word 'syntax' because of its multiple meanings, especially as it relates to natural language and computer languages. The idea was to show that the returned results have different relevance depending on whether the search was motivated by an interest in natural language or in computer languages.
The results were really surprising! The first 40 or so results were almost exclusively about the syntax of computer languages or some other system syntax. The syntax of natural language was absent altogether!
Should we be concerned about this? I think so. It is simply untrue that the word 'syntax' (as an example) relates only to computers and systems. How would middle school or elementary school children react to these results when searching for English language syntax?
I've taken the word 'syntax' as an example. There are probably many other words and phrases for which search engines provide biased results, intentionally or not.
Has the word 'syntax' lost its meaning in relation to natural language? At least, this is what a Google search might suggest to those who rely on searching the web to learn about what they don't know.
In this scenario, Google's search results seem to be mediating the meaning of the word 'syntax', and of many other words and phrases. It would be interesting to understand why Google's results are biased in favor of computer- and systems-related terminology when there are plenty of natural language syntax resources on the web.
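The kind of topical skew described above can be quantified, at least roughly. The sketch below is a hypothetical illustration: the snippets and keyword lists are made up, and in practice the snippets would come from an actual results page, but it shows one simple way to count how many top results lean toward each sense of an ambiguous query term.

```python
# Made-up result snippets for the query "syntax"; in a real experiment
# these would be scraped or copied from the search engine's first pages.
snippets = [
    "Python syntax reference for loops and functions",
    "SQL syntax guide",
    "English sentence syntax and grammar",
    "C++ syntax cheat sheet",
]

# Crude indicator vocabularies, one per sense of the word.
COMPUTING = {"python", "sql", "c++", "programming", "code"}
LINGUISTICS = {"english", "sentence", "grammar", "language"}

def skew(snippets):
    """Count snippets mentioning computing terms vs. linguistics terms."""
    word_sets = [set(s.lower().split()) for s in snippets]
    computing = sum(1 for w in word_sets if w & COMPUTING)
    linguistic = sum(1 for w in word_sets if w & LINGUISTICS)
    return computing, linguistic

print(skew(snippets))  # (3, 1): three computing-flavored hits, one linguistic
```

Even this crude tally makes the imbalance visible; a real study would need larger result sets and better sense indicators than bare keyword lists.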
Should we be concerned about search engines' power to mediate the meaning of things that affect us in our daily or professional lives?
A series of thoughts on knowledge management: KM: what's in it for me?. Worth reflecting on: "Social networking on the internet is beyond the communities of practice phenomenon, since the former is initiated and driven by the individual, and the opportunities for networking are more flexible, dynamic and fluid than communities of practice."
Towards a European Framework for the Re-use of Public Sector Information: a Long and Winding Road
by Katleen Janssen and Jos Dumortier
"Information owned by public sector bodies has, next to democratic importance, a considerable economic value for the industry in general and the information industry in particular. Since the 1980s, the European Commission has tried to stimulate the public sector to make its information available for re-use. In June 2002, it finally presented a proposal for a directive on this subject. This article gives an overview of events and documents leading to this proposal and attempts to make an assessment of the proposal. It is updated until 1 February, 2003."
"At the opening of the third preparatory meeting for the summit in Geneva, Leuenberger set out his recommendations before more than 1,900 representatives from 143 nations, the private sector and non-governmental organisations. Leuenberger added that the main bone of contention was finding ways to finance the summit initiatives and he urged the participating nations to present more concrete ideas by September 26, the last day of the prep talks."
"The three-day summit, which kicks off in Geneva on December 10, hopes to develop an action plan to provide equal access to information for all people around the world."
The initiatives for equal access to information for all people around the world are to be admired, at least for recognizing the importance of access to information in today's information society (or, better said, a society relying so heavily on information exchange).
However, with the presence of representatives from the private sector, who ultimately care about their bottom line (i.e. $$$), it can hardly be expected that much will be achieved in terms of equality of information access. This sort of exercise leads nowhere unless there is a long stick the ITU can use to implement the promoted initiatives and even modestly tilt the balance of access to information.
What usually happens in such meetings, though, is that the private sector, which controls the means of access as well as the information itself, is unwilling to give up some of its power. So what ends up happening is that the current private-sector players join forces with local private-sector players around the world, as if that amounted to equal access. The private sector is interested in the bottom line, whether in developed or developing countries. So instead of equal access to information for all, the current private-sector players extend their control of access to information even further, paradoxically via the very vehicles (such as this summit) that were supposed to enable equal access.
What is a possible solution? Perhaps the state representatives to the Information Summit need to change their policies on access technologies and access to information. These types of summits are good, but ultimately the main responsibilities reside with the states themselves, with NGOs playing an important role in pushing their governments to enact 'fair' policies regarding access technologies and access to information.
In What's a good learning culture? George presents a very informative and interesting personal experience about satisfying information seeking needs.
Apart from the fact that "information need" seems to be used interchangeably with "need for knowledge" (I'm of the opinion that information does not equal knowledge, and that the processes for satisfying information needs would therefore differ from those for satisfying knowledge needs), I agree with George that informal means of seeking information have indeed become part of our lives.
In what George has written, a few parameters emerge: structured vs. unstructured content, structured vs. unstructured communication (for content delivery), and formal vs. informal contexts.
Depending on the particular information need at hand, some combination of the above parameters is applied in the process of information seeking. If we identify the tools that help us carry out the information seeking process, a distinction becomes apparent. For example, one-to-one e-mail communication does not appear structured, and yet there might be an underlying communication structure (not necessarily apparent) arising from the common background of the participants. On the other hand, many-to-many communication (e.g. discussion lists) may present a semi-structured communication process and a semi-formal context, depending on how the discussion is run (moderated, semi-moderated, etc.).
Models of Collaboration presents, informs and suggests five models of collaboration around the contexts/situation of: Library, Solicitation, Team, Community, and Process Support.
"In this guest editorial we examine five models for collaboration that vary from barely interactive to intensely interactive. Granted the CS definition for collaboration requires some level of interaction by two or more people, and in the past we have said that reciprocal data access (such as you would find in a library or repository) is not collaboration, we have also said that technology, content and process are critical for any type of collaboration. This being the case we are expanding our definition of collaboration (slightly) to include content libraries as most of the vendors in this area have added collaborative functionality. In addition, content is often critical for a collaborate interaction to occur…" - David Coleman
(Found the link via eLearnspace.org entry)
"This bibliography presents selected English-language articles, books, and other printed and electronic sources that are useful in understanding scholarly electronic publishing efforts on the Internet. Most sources have been published between 1990 and the present; however, a limited number of key sources published prior to 1990 are also included. Where possible, links are provided to sources that are freely available on the Internet."
Check the TOC.
"Scholarly communication is the system through which research and other scholarly writings are created, evaluated for quality, disseminated to the scholarly community, and preserved for future use. The system includes both formal means of communication, such as publication in peer-reviewed journals, and informal channels, such as electronic listservs. This document addresses issues related primarily to the formal system of scholarly communication."
"The European network for Information Literacy (EnIL) aims at opening a discourse on Information Literacy at European level, in order to promoting the establishment of a Culture of Information in Europe."
"A collaborative Digital Library is a user-centered system. In addition to the traditional purpose of providing resource discovery services, the system might also provide specialized services for some classes of users, ranging from basic alerting and selective dissemination services to complex, virtual community working spaces. In this sense the Digital Library represents a special workspace for a particular community, not only for search and access but also for the process, workflow management, information exchange, and distributed work group communications. But most digital library models are based on non-digital environments. As a result, the perceptions of users and the roles they play are biased by traditional views, which might not be automatically transferable to the digital world. Nor are they appropriate for some new emerging environments. New models are challenging traditional approaches. In many cases they redefine the roles of actors, and even introduce new roles that previously did not exist or were not performed by the same type of actor. With no means of formal expression, it is difficult to understand objectively the key actor/role issues that arise in isolated Digital Library cases, or to perform comparative analysis between different cases. This directly affects how the Technical Problem Areas identified by the June 2001 DELOS/NSF Network of Excellence brainstorming report will be addressed. The report states that the highest-level component of a Digital Library system is related to the system's usage. By understanding the various actors, roles, and relationships, digital libraries will improve their ability to enable optimal user experiences, provide support to actors in their use of Digital Library services, and ultimately ensure that the information is delivered or accessed using the most effective means possible. (Report, DELOS/NSF Working Group, 13 June 2003)"
Often we hear about, or read headlines of, articles claiming to report on machines that think or computers that can understand and reason. In each instance, such claims ought to be taken with skepticism.
"Over the past five years, a team led by Sandia cognitive psychologist Chris Forsythe has been working on creating intelligent machines: computers that can accurately infer intent, remember prior experiences with users, and allow users to call upon simulated experts to help them analyze problems and make decisions."
Infer intent, remember experiences... yet the rest of the article only reports on rules and patterns that are far from any type of thinking, reasoning, or understanding.
Nevertheless, the following quote is a step in the right direction, stressing that cognitive entities (such as humans) can interact intelligently because each knows something about the other, or because they share some common background that enables contextualization and understanding:
"When two humans interact, two (hopefully) cognitive entities are communicating. As cognitive entities -- thinking beings -- each has some sense of what the other knows and does not know. They may have shared past experiences that they can use to put current events in context; they might recognize each other's particular sensitivities."
So, how does one build a cognitive entity in the true sense, or at least an approximation of one? Is it even appropriate to call a machine a cognitive entity, attaching the same connotation of cognition as it pertains to humans?
I've raised similar issues in a previous entry, why machines can't reason or think. The reason the efforts of AI (artificial intelligence) have so far proven unsatisfactory in emulating human reasoning and thinking might be the very fact that the approaches have been purely mechanistic, and thus incompatible with the nature of human experience, and with the human mind in particular. We want computers to reason, learn, and think intelligently, and yet we apply mechanistic approaches to achieve functions that require intellect?
Definition of information design from InfoDesign:
"Information Design is the intentional process in which information related to a domain is transformed in order to obtain an understandable representation of that domain." [Peter J. Bogaards, 1994]
STC Information Design SIG
Information Design Theory - A representation in the making
INFORMATION DESIGN ATELIER - R&D in information theory
Information Design and Technology program at Georgia Tech
Information Design - Tech Head Stories A nice collection of resources
organic information design [A Thesis]
I just came across an article about the concept of singularity as it pertains to society and technology. The article (Exploring the 'Singularity') goes into lengthy detail to explain the concept of singularity, what it means, and why it is supposedly 'inevitable'.
The predominant framework of the article relies on the belief that there is (or will be) an existential state, a tangible reality, in which machine intelligence is possible. This 'understanding' then leads to the belief that technology 'has a life of its own'.
The article provides the following brief and succinct definition(s):
"Kurzweil and many transhumanists define it as "a future time when societal, scientific, and economic change is so fast we cannot even imagine what will happen from our present perspective." "
That is, the change that will result when machine intelligence surpasses human intelligence.
"A number of scientists believe machine intelligence will surpass human intelligence within a few decades, leading to what's come to be called the Singularity."
As I have elaborated in another entry (why machines can't reason or think), the word 'intelligence' has two distinct meanings when applied to humans and to machines. Our intelligence is a reality that we experience: we feel it, and we manifest intelligent actions. Now, if there is to be machine intelligence in the true sense of the word, it is we humans who will have to implement it, or, as singularists would say, 'turn the switch' on that machine intelligence.
Setting aside the argument of how an intelligence could create a form more intelligent than itself, we should not forget that machines will certainly become more powerful and more capable in their information-processing functions. But are we ready to call this intelligence? Furthermore, the fast pace of technological development and advance will certainly continue. However, how is that related to machine intelligence as claimed in this article?
If anything, singularity should better refer to a future time when our human activities are fundamentally dependent on, and conditioned by, the technology that surrounds us. We saw this sort of mass behavior with the Y2K bug. It had nothing to do with machine intelligence. Actually, one can argue that it had very much to do with human mis-intelligence in depending so heavily on technology even for the most critical daily necessities.
Maybe it is time to start thinking about how to better utilize the technology around us, or how to better design the technology that will surround us, in a way that minimizes the possibility of chaos due to overdependence on information technology.
Technology is what we make it. Yes, the appropriation of technologies influences us, our society, and our activities. This influence, inscribed into the technology by us humans, might prove negative and even controlling in some instances, maybe with devastating consequences and relative chaos. However, this should not be confused with machine intelligence. We didn't create our own intelligence. How can we create machine intelligence (or artificial intelligence) at all, let alone intelligence more intelligent than our own, as the concept of singularity suggests?
""Our ultimate goal is to build a new generation of computer systems that are substantially more robust, secure, helpful, long-lasting and adaptive to their users and tasks. These systems will need to reason, learn and respond intelligently to things they've never encountered before," said Ron Brachman, the recently installed chief of Darpa's Information Processing Technology Office, or IPTO."
An example of what IPTO/PAL might do:
"If people keep missing conferences during rush hour, PAL should learn to schedule meetings when traffic isn't as thick. If PAL's boss keeps sending angry notes to spammers, the software secretary eventually should just start flaming on its own."
and this is supposed to be achieved through a proposed set of techniques called REAL-WORLD REASONING, based on three concepts: 1) high-performance reasoning techniques, 2) expanding the breadth of reasoning and hybrid methods, and 3) embedded reasoners for active knowledge bases.
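To make the PAL scheduling example quoted above concrete, here is a minimal toy sketch of that kind of preference learning. Everything here (class name, threshold, the counting heuristic) is hypothetical illustration, not the actual DARPA/IPTO design:

```python
# Toy assistant that learns to avoid scheduling meetings at hours the
# user keeps missing (e.g., rush hour), in the spirit of the PAL quote.
from collections import Counter

class ToyScheduler:
    def __init__(self, miss_threshold=3):
        self.missed = Counter()          # hour -> number of missed meetings
        self.miss_threshold = miss_threshold

    def record_missed_meeting(self, hour):
        self.missed[hour] += 1

    def suggest_hour(self, candidate_hours):
        # Avoid hours missed too often; among the rest, prefer the hour
        # with the fewest recorded misses.
        ok = [h for h in candidate_hours if self.missed[h] < self.miss_threshold]
        pool = ok or list(candidate_hours)
        return min(pool, key=lambda h: self.missed[h])

pal = ToyScheduler()
for _ in range(4):                       # the user keeps missing 9 a.m. meetings
    pal.record_missed_meeting(9)
print(pal.suggest_hour([9, 11, 14]))     # suggests a slot other than 9 a.m.
```

Of course, counting misses is mere statistics; whether accumulating such regularities amounts to 'reasoning' is exactly the question raised below.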
Now, in any dictionary, the word 'reason' has to do with mental states, analytic thought, logical deduction and induction, etc., all of which depend on the thinking process, a mental state that ultimately belongs to the human mind. If we agree that the human mind is a manifestation of the electro-mechanical-biological human brain, then an approach of rules and logical entities interconnected among themselves might some day bring about a machine that 'acts' like the human mind.
Most interesting, however, is that there do not seem to have been any attempts to look at the human processes of 'reasoning' and 'thinking' from an angle different from the electro-mechanical-biological viewpoint. A brief reading of the REAL-WORLD REASONING proposal does not reveal any new insights, except that it proposes another approach based on the information-processing understanding of information, where bits of information are manipulated using relevance judgments for 'aboutness' assessment. Perhaps the notion of relevance as used over the past few decades needs to be reassessed?
So, what is uniquely different about the REAL-WORLD REASONING proposal?
The reason the efforts of AI (artificial intelligence) have so far proven unsatisfactory in emulating the human reasoning and thinking process might have to do with the very fact that the approaches have been only mechanistic, and thus incompatible with the very nature of human experience and with the human mind in particular. So we want computers to reason, learn, and think intelligently, and yet we apply mechanistic approaches to achieve functions which require intellect?
It would be nice to hear if anyone knows of an effort, practical or theoretical, that attacks the issues of machine 'thinking' and 'reasoning' from a perspective fundamentally different from the information-as-thing (i.e., mechanistic) understanding. Anyone?
(courtesy of ShelfLife, No. 116 (July 24 2003))
"Libraries are collaborative by nature, sharing expertise, staff and ideas. Shared cataloguing is a good example: a cataloguer in one library creates a record about a book for use in a central database rather than just his own system, and everyone else who contributes to that database can download that record into their local systems rather than re-doing it themselves.
Now librarians are talking about extending that collaboration and "deep sharing" digital content by creating a Distributed Online Digital Library. The DODL would depart from the status quo in terms of function, service, reuse of content and library interdependency. First, it would allow a common interface for distributed collections, rather than the widely divergent "looks" of today's linked collections. Second, and more radically, it would allow both librarians and end users to download digital master files as malleable objects for local recombinations. This means they could be enriched with content from librarians or teachers, specially crafted for particular audiences, and unified in appearance and function. A user could download, combine, search, annotate and wrap the results in a seamless digital library mix for others to experience. The services such deep sharing could provide are staggering, and the economics are just as attractive. Imagine 30 libraries coordinating to digitize their collections. Each funds individual parts of the project, but all equally share in the sum of their efforts. So for the cost of building one digital object and depositing it in the DODL, each library would gain 30 downloadable objects. As participation becomes more widespread, the equation becomes even more compelling. (Educause Review Jul/Aug 2003) http://www.educause.edu/ir/library/pdf/erm0348.pdf"
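The cost-sharing arithmetic in the quoted DODL example can be sketched in a few lines. The 30-library figure comes from the quote; the dollar amount is an invented placeholder:

```python
# Back-of-the-envelope sketch of the DODL leverage described above:
# N libraries each digitize and fund one object, and every participant
# gains access to all N objects.
def dodl_leverage(num_libraries, cost_per_object):
    objects_gained = num_libraries                 # each library can download all N
    effective_cost = cost_per_object / objects_gained
    return objects_gained, effective_cost

# Hypothetical figure: each digital object costs $900 to produce.
objects, unit_cost = dodl_leverage(num_libraries=30, cost_per_object=900.0)
print(objects)      # 30 downloadable objects per library
print(unit_cost)    # 30.0 -- effective cost per object, 1/30th of going it alone
```

As participation grows, `num_libraries` rises and the effective cost per accessible object keeps falling, which is the "even more compelling equation" the quote alludes to.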
In Making Computers Understand, Leslie Walker reports on an apparent innovation suggesting that computers can understand and be aware of context. While the phraseology chosen might be journalistic license to 'spice up' the article, some claims by the company are nevertheless rather troublesome:
“Abir, 46, claims to have unlocked the mystery of "context" in human language with a series of algorithms that enable computers to decipher the meaning of sentences -- a puzzle that has stumped scientists for decades.”
"This man literally has figured out the way the brain learns things," Klein said. "On a theoretical level, his insight basically is this: Understanding a concept is nothing more than viewing a concept from two different perspectives."
The very title of the article, "Making Computers Understand," makes you immediately skeptical. Especially troublesome is the quote above stating that "This man literally has figured out the way the brain learns things." Isn't it premature to claim with such certainty that we have discovered how the brain works, when history has shown that many such claims have been proven wrong by later discoveries and innovations?
Further, how does one prove that two different perspectives are sufficient for understanding a concept? I hope this does not mean they believe two perspectives are necessary because there are 'two sides to the same story'. Usually there are more than two sides to the same story, and understanding 'reality' and its context probably takes much more than two perspectives.
Besides, computers can’t decipher the meaning of a sentence as claimed in the article…
In It All Adds Up, the notion of calculations as knowledge assets is presented as a novel and unique process in KM:
"Specifically, MathSoft is promoting the idea of using its technology to facilitate what it calls calculation management—the practice of viewing engineering calculations as knowledge assets that should be managed and reused."
Aren't the folks at CIO magazine a bit late in their 'discovery'? Mathcad calculations put up on an intranet for use by a community of engineers are nothing more than scripts (or processes) for performing certain functions: producing some sort of output given a set of inputs. The open source movement has been doing this for how long? :)
Relatively speaking, in a corporate-culture context where knowledge (here in the form of scripts and calculations) is perhaps not easily shared by individuals for fear of losing some advantage, this could be considered a unique knowledge management practice.
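What 'calculation management' amounts to can be shown with a rough sketch: an engineering calculation published once as a reusable script rather than redone ad hoc by each engineer. The function name and the numbers are illustrative only (and Mathcad itself manages worksheets, not Python functions):

```python
# A managed, reusable engineering calculation: the standard end-loaded
# cantilever beam deflection formula, delta = F * L^3 / (3 * E * I).
def cantilever_tip_deflection(force_n, length_m, e_pa, i_m4):
    """Tip deflection (metres) of an end-loaded cantilever beam."""
    return force_n * length_m ** 3 / (3 * e_pa * i_m4)

# Any engineer on the intranet reuses the same vetted calculation
# instead of rederiving it (hypothetical steel beam: E = 200 GPa).
deflection = cantilever_tip_deflection(
    force_n=1000.0, length_m=2.0, e_pa=200e9, i_m4=8e-6)
print(round(deflection * 1000, 3))  # tip deflection in millimetres
```

The 'knowledge asset' here is precisely the vetted formula plus its documented inputs and units, which is what a shared script captures and a one-off spreadsheet does not.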
"As stated above, democracy is not just a reformation of institutions (this is its final stage) but a reformation in the minds of the society on the whole, coming from its interior forces. If the socium is ripe enough to take over the responsibility to realise itself and its future, it means that the primary reform should happen in two spheres. First of all, in education: in breaking obsolete traditions in the minds, and educating citizens with creative, free mentality, capable of actively participating in discussions and comprehending social realities and tendencies impartially, without creating idealistic abstractions in the best sense of medieval utopias, having nothing in common with the realities of the present day life. Secondly, in the sphere of information, which should lay the foundation, the global field for working out social ideas, perception of present realities and their possible evolution. If the stress is not put on those two cornerstones of democracy then countries with either totalitarian or any other unnatural type will come forward attempting to hide by the democratic forms the negative aspects of life, or the developed democratic countries would mess around the external institutions and loose the real interest for reforms and, thus, would prepare good grounds for external, illusionary but very active social activity for the sake of the activity."
"the ability to find, evaluate, use, and communicate information in all of its various formats" - Work Group on Information Competence, Commission on Learning Resources and Instructional Technology (CLRIT), California State University (CSU) system. Information Competence in the CSU: A Report. Dec. 1995. http://www.csupomona.edu/~library/InfoComp/definition.html
A definition recommended by the Work Group is that information competence is the fusing or the integration of library literacy, computer literacy, media literacy, technological literacy, ethics, critical thinking, and communication skills.
Information Literacy: The ability to know when there is a need for information, to be able to identify, locate, evaluate, and effectively use that information for the issue or problem at hand.
In my previous papers (media & communication) I tried to show that the open source concept/phenomenon and its communicative elements are innovative ideas, giving rise to open communication technology, enabling the masses to communicate free from elite control, and possibly acting as antidotes to hegemonic ideology. To do so, I applied the constitutive view of communication, suggesting that open source is an enabler of 'free dissemination' and open communication.
Recognizing Ranganathan's five laws of library science and their underlying concepts as powerful inspirations for social change, I would like to analyze open source software, as defined by the Open Source Initiative (OSI), and its congruency with the five laws. If the underlying concepts upon which the five laws are built had such a profound impact on our society, then the proponents of the open source movement can learn a thing or two. The actual definition of open source software is a lengthy one; instead, a summarized definition from the OSI's Frequently Asked Questions (FAQ) follows:
“Open source promotes software reliability and quality by supporting independent peer review and rapid evolution of source code. To be OSI certified the software must be distributed under a license that guarantees the right to read, redistribute, modify, and use the software freely” (The OSI).
A 'book' is the basic element of Ranganathan's laws: it contains objective knowledge. This calls for defining the comparable basic element of software development. Therefore, I will take the term 'software' to be the basic element: it contains objective knowledge. I use the term 'software' loosely, as it can mean a software product or software modules that can be used to build other software products. Respectively, the five laws of the 'software library' could be:
The First Law
Books are for use (Ranganathan, p. 26)
Software is for use
The Second Law
Every reader his or her book
Every user his or her software (or software is for all)
The Third Law
Every book its reader
Every software its user
The Fourth Law
Save the time of the reader
Save the time of the user
The Fifth Law
Library is a growing organism (Ranganathan, p. 326)
A software library is a growing organism
Note: The American Heritage Dictionary defines 'library' as it pertains to computer science in the following way: a collection of standard programs, routines, or subroutines, often related to a specific application, that are available for general use.
I think the experience cube denotes the fact that the movement from information to knowledge (under an information-as-thing viewpoint of information) cannot be attained solely through the manipulation of information objects and/or knowledge artifacts. It shows the 'thing' necessary beyond what is represented in knowledge artifacts. The experience cube partially resides in (or is part of) Popper's World II.
At its most basic, relevance is about matching the pertinent thing to an information need. It is established and evaluated by matching the representation of texts against the representation of an information need (Saracevic, p. 6). As one of the most basic and important concepts in information retrieval (IR) systems, relevance has evolved and in many ways led the evolution of research, design, and development of IR systems. The concept of relevance has evolved from a system-centric approach toward a more user-centric approach within the discourse of the cognitive viewpoint of information science.
The intuitive understanding of relevance and aboutness has come to be recognized as very complex, as Mizzaro and Saracevic have shown. In the system-centric approach, "Relevance is considered to be a property of the system – it depends on how the system acquires, represents, organizes and matches texts, or in other words on the internal manipulation of the system" (Saracevic, p. 6). With the move toward the cognitive viewpoint, research has elaborated on the various attributes of relevance and the various manifestations in which relevance exhibits itself. So, which relevance should IR system designers, developers, and researchers deal with? The paradoxical answer is: the relevant relevance, at the appropriate level or dimension of manifestation. Based on the intuitive understanding of relevance, Saracevic derives that "as a cognitive notion relevance involves an interactive, dynamic establishment of a relation by inference, with intentions toward a context" (Saracevic, p. 5).
From the above it is evident that context matters. Relevance cannot be addressed without a context, especially in relation to interactive IR systems with the user(s) as the central element affecting multiple manifestations of relevance: "Relevance is a dynamic phenomena: For the same judge, a document may be relevant at a certain point of time and not relevant later, or vice versa" (Mizzaro, p. 814).
If there are multiple manifestations of relevance, is it feasible to identify a composite relevance that can give us insight into relevance as it pertains to a particular situation and task? Such a challenge would perhaps require understanding the relations among the various manifestations of relevance.
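The system-centric notion of relevance as matching representations of texts against representations of information needs can be illustrated with a deliberately crude toy. Here both representations are reduced to bare term sets and scored by overlap; real IR systems use far richer representations, so this only illustrates the matching idea, not any particular system:

```python
# Toy system-centric relevance: both the document and the information
# need are represented as term sets; relevance is the fraction of
# need-terms the document matches.
def represent(text):
    return set(text.lower().split())

def system_relevance(document, query):
    doc, need = represent(document), represent(query)
    if not need:
        return 0.0
    return len(doc & need) / len(need)

docs = [
    "relevance in information retrieval systems",
    "knitting patterns for beginners",
]
query = "relevance in retrieval"
ranked = sorted(docs, key=lambda d: system_relevance(d, query), reverse=True)
print(system_relevance(ranked[0], query))  # 1.0 -- the IR document ranks first
```

Note how everything here is "a property of the system", in Saracevic's phrase: nothing in the score reflects the user, the task, the context, or the moment in time, which is exactly what the cognitive viewpoint objects to.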
Mizzaro, S. (1997). Relevance: the whole history. Journal of the American Society for Information Science, 48 (9), 810-832
Saracevic, T. (1996). Relevance reconsidered. Information science: Integration in perspectives. Proceedings of the Second Conference on Conceptions of Library and Information Science. Copenhagen (Denmark), 201-218
The most pervasive response to the question "what is it about" in relation to an information object refers to the pertinent topic or theme as perceived by the individual responding. Rarely would the response address the methodology or the framework within which the information object was created. In the case of a textual document, a response could possibly refer to the methodology, but in most instances perhaps only because methodology is the topical issue covered in the document. Even when the response concerns topicality, it is hard to agree on the aboutness of a particular document with great certainty. Nevertheless, in communication with each other, humans intuitively understand and agree on what things are about and what they relate to. This intuitive understanding of relevance seems closely related to the definition found in many dictionaries, "…pertaining to the matter at hand," which people use without thinking much about it (Saracevic, 1996, p. 3).
Considering the need to search for information, the potential resources that can satisfy that need, and its aboutness, Mizzaro suggests that "each relevance can be seen as a point in a four-dimensional space, the values of each of the four dimensions being: (i) surrogate, document, information; (ii) query, request, information need, problem; (iii) topic, task, context, and each combination of them; and (iv) the various time instants from the arising of the problem until its solution" (Mizzaro, p. 812).
The aboutness dimension (task, topic, and context) is rather incomplete, in the sense that aboutness in relation to time could have been included, in addition to time being the fourth dimension. The difference between time as a fourth dimension and time related to aboutness is that time aboutness would give us relevance related to the passage of time.
For example, a document might be less relevant today in a certain organizational context compared to the relevance it had earlier, because other documents appearing later have superseded it; something like the induced difference in relevance judgments between two points in time, plus the additional difference in relevance when those two points in time are moved together to another time.
This could be considered different from the fourth dimension, where the relation among the need for information, the resource to satisfy the need, and its aboutness changes at different points in time. One could argue, however, that time aboutness is part of the context.
Nevertheless, I think time aboutness should be treated separately, as are the task, the topic, and the context.
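One way to make Mizzaro's four-dimensional space concrete is to treat each relevance judgment as a point with one value per dimension. The field names below paraphrase Mizzaro (1997); the proposed 'time aboutness' could be added as a further component, but it is not part of Mizzaro's original model and is left out here:

```python
# Each relevance judgment as a point in Mizzaro's four-dimensional space.
from dataclasses import dataclass

@dataclass(frozen=True)
class RelevancePoint:
    resource: str    # surrogate | document | information
    need: str        # query | request | information need | problem
    aboutness: str   # topic | task | context, or a combination of them
    time: str        # an instant between the problem arising and its solution

r_before = RelevancePoint("document", "request", "topic", "t0")
r_after = RelevancePoint("document", "request", "topic", "t1")
print(r_before == r_after)  # False: same judge and document, different time instant
```

The inequality of the two points is just Mizzaro's observation in data-structure form: the same document against the same request can yield a different relevance at a different time.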
Mizzaro, S. (1997). Relevance: the whole history. Journal of the American Society for Information Science, 48 (9), 810-832
What are the challenges in the production and dissemination of IC (intellectual capital) statements and measures?
In identifying these challenges we perhaps need to look at a few things:
a) what type of intellectual capital and knowledge are these statements and measures representing and meta-representing,
b) what is the intended use,
c) the role of dissemination channels and media type,
d) the role of the context.
The desired result would be to design meaningful and understandable intellectual capital statements and measures such that they represent and transfer as much as possible from the 'intangible world' into the 'tangible world', moderated by the context and the situation as well as by the available channels and modes of dissemination.
The representation aspect of such statements is clearly emphasized by McInerney: “Although most information managers are not trained as journalists, a reporter’s skills of capturing, recording, and reporting new knowledge could be beneficial in the active process of finding out what an organization’s members know” (p. 1016).
One could argue that the representation stage is unnecessary when spoken language is used to transfer 'knowledge'. Even spoken language, though, is a form of representation of the intangible (short-lived unless audio-recorded or transcribed), and we clearly attempt to use the most appropriate words for representing concepts when sharing our thoughts with others.
McInerney, C. (2002). Knowledge Management and the Dynamic Nature of Knowledge. Journal of the American Society for Information Science and Technology, 53, (12) 1009-1018
“You can’t manage what you don’t know about” (Blair, p. 1027)
“Knowledge management is not an end in itself, it is a means to a further end” (Blair, p. 1028)
One of the most important aspects regarding knowledge management (KM), both as theoretical endeavor and practice, appears to pertain to the question what is it that is being managed. Or, we can better ask ourselves as to what do various authors mean when referring to KM. What’s in the name? In order to differentiate KM from information and data management it needs to be shown that knowledge is different than data and information. Blair’s (2002) explication that knowledge is different than data and information is based on the information theory stratification which puts data as the raw thing, then information which means data arranged in a certain way that presents and brings forth an obvious interpretable meaning, and then knowledge as the next level up, mainly stating that knowledge, exhibited through it characteristics, is different because it resides in peoples minds and it is not tangible (p. 1020). McInerney (2002) also presents the information theory viewpoint of knowledge: “in information theory, knowledge has been distinguished by its place on a hierarchical ladder that locates data on the bottom rung, the next belonging to information, then knowledge, and finally wisdom at the top” (p. 1010). It appears that this kind of placement of knowledge fits better with KM as practice since it distinguishes information-as-thing to be something tangible. If however we look as Brookes’s (1980) elaborations regarding ‘information’, he defines information as a "small bit of knowledge” and “knowledge as a structure of concepts linked by their relationship and information as a small part of such structure” (p. 131). There does not seem to be a necessity to explain why information is different than knowledge, for both Blair and McInerney could have proceeded with their arguments in the articles by showing that knowledge is not a tangible (in physical sense) thing. 
An argument for the necessity to differentiate knowledge from information in such terms appears to respond to a need to clearly and unambiguously distinguish knowledge management from information and document management (Blair, p. 1019), perhaps more so for KM practitioners.
One of the most important questions regarding knowledge management (KM), both as a theoretical endeavor and as a practice, appears to be what it is that is being managed. Or better, we can ask what various authors mean when referring to KM. What's in a name?
In order to differentiate KM from information and data management, it needs to be shown that knowledge is different from data and information. Blair's (2002) explication that knowledge is different from data and information is based on the information-theory stratification, which puts data as the raw thing, then information, meaning data arranged in a certain way that presents an obvious, interpretable meaning, and then knowledge as the next level up, stating mainly that knowledge, exhibited through its characteristics, is different because it resides in people's minds and is not tangible (p. 1020). McInerney (2002) also presents the information-theory viewpoint of knowledge: "in information theory, knowledge has been distinguished by its place on a hierarchical ladder that locates data on the bottom rung, the next belonging to information, then knowledge, and finally wisdom at the top" (p. 1010).
It appears that this kind of placement of knowledge fits better with KM as practice, since it distinguishes information-as-thing as something tangible. However, if we look at Brookes's (1980) elaborations on 'information', he defines information as a "small bit of knowledge" and "knowledge as a structure of concepts linked by their relationship and information as a small part of such structure" (p. 131).
There does not seem to be a necessity to explain why information is different from knowledge, for both Blair and McInerney could have proceeded with their arguments by showing that knowledge is not a tangible (in the physical sense) thing. The insistence on differentiating knowledge from information in such terms appears to respond to a need to clearly and unambiguously distinguish knowledge management from information and document management (Blair, p. 1019), perhaps more so for KM practitioners.
Blair, D.C. (2002). Knowledge Management: Hype, Hope, or Help? Journal of the American Society for Information Science and Technology, 53, (12) 1019-1028
Brookes, B.C. (1980). The foundation of information science. Part I. Philosophical aspects. Journal of Information Science 2, 125-133
McInerney, C. (2002). Knowledge Management and the Dynamic Nature of Knowledge. Journal of the American Society for Information Science and Technology, 53, (12) 1009-1018
The challenge related to the ambiguity of tracks is manifold, considering that "in contrast to the knowledge of the present, that of the past is necessarily 'indirect'" (Bloch, p. 48). Bloch further explains that "by 'indirect knowledge' the methodologists have generally understood that which arrives at the mind of the historian only by way of other human minds" (p. 53).
This process of knowledge 'traveling' through the human mind unquestionably involves processes of presentation and representation (except for the memorization of texts, which 'copies exactly'), usually in written form. In this process there is constant meaning-making and interpretation (Bloch, p. 187) of the available material. Also, words have different meanings across time and space; therefore, historical research needs to account, to the extent possible, for the contextual meaning of words 'placed' in time and space (Bloch, p. 163).
Bloch, M. (1953). The historian's craft. New York: Vintage Books.
“…we have no other device for returning through time except that which operates in our minds with the materials provided by past generations” (Bloch, 1953, p.57)
In this short yet very significant quote about historical research, Bloch (1953) succinctly states, at a high level, the methodology suitable for a historian to follow, presents a basic but significant tool the historian ought to use, and identifies the critical, basic investigative unit: "the materials provided by past generations" (p. 57). Even though the quote concerns historical research as understood and described by the field of history, i.e., "true historical research, or historiography or intellectual history, is concerned with analyzing and interpreting the meaning of historical events within their context" (Powell, 1997, p. 166), Bloch's description of the historian's craft is perhaps applicable and usable in many other fields that undertake the task of investigating and making sense of materials not immediately available in the spatial and temporal present, as described by Powell: "History has two dimensions…. Historical time, or the chronology which takes into account the spacing of events and/or patterns…. [and] Historical space or where events occurred (i.e., geographical location)" (Powell, p. 166).
From the viewpoint of historical research, especially interesting is Darnton's (1990) elaboration on the history of books, or better, Darnton's application of various historical research tools in his study of the history of the book, with its aim "… to understand how ideas were transmitted through print and how exposure to the printed word affected the thought and behavior of mankind during the last five hundred years" (p. 107).
In this sense, it can be argued that historical research, as an artisan's tool and methodology, falls within the realm of information studies, or even information science, especially when utilized in analyzing and studying the movement and influence of ideas, thoughts, concepts, and knowledge, represented at various levels of intentionality and manifested as temporal and spatial information dissemination via the printed press, which contains representations of the products of the human mind.
It appears then that information acts as a carrier, a transmission channel on which ideas, thoughts, concepts, and knowledge ride.
Darnton, R. (1990). The Kiss of Lamourette. New York: W.W. Norton.
With respect to Ranganathan's second law, "EVERY PERSON HIS OR HER BOOK" (OR BOOKS ARE FOR ALL) (p. 81), a comparable enunciation would be EVERY PERSON/USER HIS OR HER DIGITAL INFORMATION OBJECT (OR DIGITAL INFORMATION OBJECTS ARE FOR ALL). Obviously, in the context of the digital library, this enunciation has far-reaching consequences and implications in terms of legal issues such as copyright, ownership, freedom of speech, information democracy, etc.
However, an interesting implication relates to information literacy or, better said, digital information literacy. Given the multitude of digital information objects, even if it were possible and feasible to make all digital information objects available to all users (the obviously hard issue of relevance, in both research and practice), it is hard to say whether users would be able to 'read' and 'understand' the various digital information objects. We are all familiar with how to read text as narrative. But does every user know how to contextually read a chart, a bar graph, or a video presentation of an unknown phenomenon?
It appears that information and media literacy issues are underexplored in the study of digital libraries. Marchionini indirectly raised the issue of technology versus the user in context: "The experience of this case [The Baltimore Learning Company] demonstrated that advanced technical solutions and high-quality content are not sufficient to initiate or sustain community in settings where day-to-day practice is strongly determined by personal, social and political constraints" (p. 23).
Technology alone can’t fix problems.
Marchionini, G., Plaisant, C., & Komlodi, A. (in press) The people in digital libraries: Multifaceted approaches to assessing needs and impact. Chapter in Bishop, A. Buttenfield, B. & VanHouse, N. (Eds.) Digital library use: Social practice in design and evaluation. Retrieved October 26th, 2002 from: http://ils.unc.edu/~march/revision.pdf
Ranganathan, S. R. (1957). The five laws of library science. London: Blunt and Sons, Ltd. pp. 11-31, 80-87, 258-263, 287-291, 326-329
“Human-centered digital library design is particularly challenging because human information behavior is complex and highly context dependent, and the digital library concept and technologies are rapidly changing” (Marchionini et al., p.1)
Digital libraries, like many other unique conceptual and practical phenomena resulting from the information explosion, have presented researchers and practitioners alike with the challenge of understanding their very complex and multifaceted nature. As with any emerging concept and practice, there is a struggle to define its scope and its contextual situatedness. All three articles in one way or another deal with the definition and meaning of the term 'digital library', its social relevance, its place in the information society amid the multitude of contexts in which it is embedded, and its implications for research and practice.
Machlup and Mansfield (1983), aiming "to analyze the logical (or methodological) and pragmatic relations among the disciplines and subject areas that are centered on information" (p. 3), look at the various sciences, disciplines, and fields of study that directly or indirectly take information as their subject. They present a well-rounded argument and historical overview. The aim is to explicate, if possible, the intersection (its nature and properties) of all those activities that deal with information directly or indirectly, given that "Information is not just one thing. It means different things to those who expound its characteristics, properties, elements, techniques, functions, dimensions, and connections" (Machlup et al., p. 4).
The preceding quote also suggests that the different meanings of information have paved the way for the emergence and divergence of many different sciences, disciplines and fields of study related to information.
To remedy this diffusion, Machlup et al. suggest “that most of the confusion caused by the use of the term information science in its broadest sense could be avoided by the addition of the plural s. The information sciences could then take place alongside the natural sciences, the social sciences, and other umbrella terms that indicate a grouping of disciplines and fields of study that share common characteristics” (p. 19). This suggestion is a novel one; perhaps one day we will be talking of the ‘school of information sciences’.
A general observation is that information science is a science in the making, not yet fully established as a ‘normal science’ in the Kuhnian sense: “’normal science’ means research firmly based upon one or more past scientific achievements, achievements that some particular scientific community acknowledges for a time as supplying the foundation for its further practice” (Kuhn, 1970, p. 108).
Also, the various information problems treated by information science lack a coherent paradigmatic understanding and definition of the information phenomenon: “in the absence of a paradigm or some candidate for paradigm, all of the facts that could possibly pertain to the development of a given science are likely to seem equally relevant” (Kuhn, p. 113). As such, the multitude of information problems is addressed by a variety of methodologies, conceptual viewpoints, and theories borrowed by information science practitioners from the other social and natural sciences with which information science has interdisciplinary relations.
Kuhn, T. (1970). Chapter 2: The route to normal science. In The structure of scientific revolutions (2nd ed., pp. 10-22). University of Chicago Press
In the various discourses treating 'information' and information science, a very suggestive and potentially fruitful understanding of the information phenomenon is surprisingly missing. The following quote by Brookes suggests that the various information and knowledge artifacts contain within themselves objective information and knowledge: “The artifacts which record human knowledge exosomatically become independent of the knowing subjects who created them. These artefacts are no longer subjective and inaccessible but objective and accessible to all who care to study them….” (p. 128). If these physical objects carry and transmit the symbols, isn’t it feasible then to think of ‘information’, somehow embedded with the symbols, as the conceptual channel for transmitting ideas, thoughts, concepts, and knowledge? This conceptualization of information would not be justified if the knowledge deposited in knowledge artifacts could not be considered “independent of the knowing subjects who created them”.
Brookes, B.C. (1980). The foundation of information science. Part I. Philosophical aspects. Journal of Information Science, 2, 125-133
This study analyses, compares, and critiques a set of articles and writings that have examined the ‘information’ phenomenon and the ways various discourses and understandings of ‘information’ have been utilized in information science and information studies. The methodological and theoretical foundations of the various understandings are discussed, without forgetting the effect of the context within which the various concepts and understandings of ‘information’ came into existence and use. In addition, an attempt is made to understand and trace the impact of these understandings in their subsequent use within practical and theoretical studies in information science, information studies, communication technology, and new media, as well as the role of information science and information-related practices in the development of the understandings of ‘information’. The examination of the relevant literature shows a two-sided aspect in the development of the information concept and the information-related disciplines (the science and the practice), which constantly inform each other over time: the understandings of ‘information’ pose questions to be answered by information science research and practice and, vice versa, information practice instigates a need to properly understand information.
The following quote by Brookes presents a great challenge: “In other words, once human knowledge has been recorded [in World III], it attains a degree of permanence, an objectivity, an accessibility which is denied to the subjective knowledge of individual humans” (Brookes, p. 128). Most intriguing about this statement is that knowledge, once recorded, attains a degree of permanence, objectivity, and accessibility. I am not quite sure whether Brookes meant relative permanence, objectivity, and accessibility, bound by time and space. Otherwise, the statement would suggest that recorded knowledge and information have intrinsic properties, characteristics, or structures which can be detached and maintained in a truly objective manner outside of the situation and context in which they were created. If so, understanding these characteristics, structures, properties, and manifestations could be the first step toward a theory of information.
Brookes, B.C. (1980). The foundation of information science. Part I. Philosophical aspects. Journal of Information Science, 2, 125-133
In everyday life, the word ‘information’ is closely associated with the concept of communication, more specifically with the communication of ideas, thoughts, and knowledge, bringing forth an understanding of information as having the property of conveying ideas, thoughts, concepts, and knowledge. But how exactly is information conveyed? If information is conveyable, is it the process that helps convey understanding between two human beings, or is it the knowledge conveyed between two cognitive entities? These questions bring forth different understandings of the word information, as Machlup and Mansfield (1983) succinctly capture it, suggesting that information is not a thing that is simple to describe and explain. It is a phenomenon with multifaceted understandings, perhaps requiring a multitude of methodologies and means of investigation and research. Buckland (1991) identifies three principal uses of the word information: 1) information-as-process (the ability to inform), 2) information-as-knowledge (the knowledge imparted in the process of being informed), and 3) information-as-thing (p. 3), concentrating on the various properties of information and its different manifestations and understandings.
Machlup, F. & Mansfield, U. (1983). Cultural Diversity in Studies of Information. In F. Machlup and U. Mansfield (Eds.), The Study of Information. Wiley, 3-59
Buckland, M. (1991). Information and Information Systems. Chapters 1, 4, 5 & 6. New York: Praeger
Skeptical Knowledge Management In: Hans-Christoph Hobohm (Ed.): Reader: Knowledge Management and Libraries. IFLA (International Federation of Library Associations and Institutions) Publication series, Munich: Saur 2003 (in print)
Stable Knowledge (2000) Paper presented at the Workshop: Knowledge for the Future - Wissen für die Zukunft, Brandenburgische Technische Universität Cottbus, Zentrum für Technik und Gesellschaft, March 19-21, 1997.
In his visionary article “As We May Think”, Vannevar Bush (1945) can easily be said to have put in motion many of the concepts of various contemporary sciences. Most remarkable of all, for students of information objects and knowledge records, is his vision and vivid description of the memex and what it could do to capture the knowledge of individuals and make it available and accessible for generations to come (p. 101), effectively recognizing societal knowledge and its importance for the future of humankind.
Bush also presents a problem and a challenge to the cognitive viewpoint of information science, and to cognitive science in general. His concept of association is most remarkable: “… however, to associative indexing, the basic idea of which is a provision whereby any item may be caused at will to select immediately and automatically another. The process of tying two items together is the important thing” (p. 107). It took practitioners of computer science almost half a century to implement this important concept in databases, resulting in relational databases, considered by many experts one of the most important innovations in the field of digital storage of and access to data and knowledge records.
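Bush's "tying two items together" can be sketched in relational terms (a toy illustration under my own assumptions, not Bush's design; the table and item names are hypothetical): a link is simply a row relating two item identifiers, so either item can be made to "select immediately and automatically another".

```python
import sqlite3

# In-memory database: an "items" table and an associative "links"
# table that ties any two items together, loosely echoing Bush's
# associative trails. All names here are illustrative only.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, title TEXT)")
con.execute("CREATE TABLE links (a INTEGER, b INTEGER)")
con.executemany("INSERT INTO items VALUES (?, ?)",
                [(1, "As We May Think"), (2, "The Memex")])
con.execute("INSERT INTO links VALUES (1, 2)")

# Given one item, the link row lets us retrieve the associated one:
row = con.execute(
    "SELECT i.title FROM links l JOIN items i ON i.id = l.b "
    "WHERE l.a = 1").fetchone()
print(row[0])  # The Memex
```

The point of the sketch is only that the association itself is a first-class record, queryable in either direction.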
Bush, Vannevar (1945). As We May Think. Atlantic Monthly, 176, (11), 101-108
Buckland’s three senses of information, 1) information-as-process (the ability to inform), 2) information-as-knowledge (the knowledge imparted in the process of being informed), and 3) information-as-thing (p. 3), are the most pervasive understandings of information in use by various disciplines, with information-as-thing perhaps most evidently affecting the understanding of information science research and practice so far.
“information-as-thing … is the only form of information with which information systems can deal directly” (Buckland, p.54).
“… [Knowledge/information] … is intangible. One cannot touch it or measure it in any direct way” (Buckland, Ch. 1)
“Therefore, to communicate them [knowledge, beliefs, and opinions], they have to be expressed, described, or represented in some physical way, as mark, signal, text, or communication.” (Buckland, Ch. 1)
“Since information and information handling is pervasive in human activities, an exploration of information systems that did not include the social, economic, and political context and the broad social role of information would be seriously incomplete” (Buckland, Ch. 1)
Buckland, M. (1991). Information and Information Systems. Chapters 1, 4, 5 & 6. New York: Praeger
Before I started this class (194:601, Fall 2001), I had an idea about what information is. Perhaps I was more or less in tune with the technical view of information (something that can be measured) as a result of my telecommunications and information technology background. I was pleasantly surprised to learn about the social aspect of information and its related phenomena, discovering that telecommunication and information technology are actually 'products' resulting from a multitude of problems treated in the information science domain. Brookes's 'fundamental equation of information science', K[S]+ΔI=K[S+ΔS] (Brookes, p. 131), is a very profound expression of the human natural way of thinking and a basis for treating various aspects of information-related phenomena. Having defined information as a "small bit of knowledge” (p. 131), Brookes further explains his view of “knowledge as a structure of concepts linked by their relationship and information as a small part of such structure” (p. 131). Noting that “theoretical information science hardly yet exists” (p. 125), Brookes defines “the task of information science … as the exploration of this world of objective knowledge which is an extension of, but is distinct from, the world of documentation and librarianship” (p. 125).
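Set out explicitly in Brookes's own notation, the fundamental equation reads as a simple before-and-after statement: a knowledge structure absorbs an increment of information and becomes a modified structure.

```latex
% Brookes's fundamental equation of information science:
% a knowledge structure K[S] absorbs an increment of information
% \Delta I and becomes the modified structure K[S + \Delta S],
% where \Delta S marks the change the information works on the
% structure itself.
\[
  K[S] + \Delta I = K[S + \Delta S]
\]
```

Read this way, the equation encodes his definitions directly: information is a "small bit of knowledge", and absorbing it alters the very structure that receives it.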
'Objective knowledge' is the main concept around which Brookes's fundamental equation operates, situated in Popper’s World 3: “He [Popper] recognizes a third world, that of objective knowledge which is the totality of all human thought embodied in human artifacts, as in documents of course, but also in music, the arts, the technologies. These artifacts enshrine what Popper declares to be his autonomous—or near autonomous—world of objective knowledge” (p. 127). The other two Popper worlds are the physical world (World 1) and “World 2, the world of subjective mental states, [which] is occupied by our thoughts and mental images….” (p. 129).
Before I started the Ph.D. program here at SCILS, I thought I had a clear idea about what information is, only to discover that “what is information?” seems to be a classical and still unanswered question. Brookes’s elaboration of Popper’s Worlds 1, 2, and 3 from the information and knowledge conceptual realms presents refreshing thoughtfulness about the concepts of information and knowledge. Brookes’s 'fundamental equation of information science', K[S]+ΔI=K[S+ΔS] (Brookes, p. 131), is a very profound expression of the human natural way of thinking and a basis for treating various aspects of information-related phenomena. Having defined information as a "small bit of knowledge” (p. 131), Brookes further explains his view of “knowledge as a structure of concepts linked by their relationship and information as a small part of such structure” (p. 131). Noting that “theoretical information science hardly yet exists” (p. 125), Brookes defines “the task of information science … as the exploration of this world of objective knowledge which is an extension of, but is distinct from, the world of documentation and librarianship” (p. 125).
'Objective knowledge' is the main concept around which Brookes's fundamental equation operates, situated in Popper’s World 3: “He [Popper] recognizes a third world, that of objective knowledge which is the totality of all human thought embodied in human artifacts, as in documents of course, but also in music, the arts, the technologies. These artifacts enshrine what Popper declares to be his autonomous—or near autonomous—world of objective knowledge” (p. 127). The other two Popper worlds are the physical world (World 1) and “World 2, the world of subjective mental states, [which] is occupied by our thoughts and mental images…” (p. 129).
In an attempt to identify what information science ought to do, Brookes recognizes that “documents and knowledge are not identical entities” (p. 127) and differentiates between practical and theoretical information science: “the practical work of library and information scientists can now be said to collect and organize for use the records of World 3. And the theoretical task is to study the interactions between Worlds 2 and 3, to describe them and explain them if they can and so to help in organizing knowledge rather than documents for more effective use” (pp. 128-129)
World 1 = the physical world
World 2 = the world of subjective mental states occupied by our thoughts and mental images
World 3 = the world of objective knowledge which is the totality of all human thought embodied in human artifacts, as in documents of course, but also in music, the arts, the technologies
Brookes, B.C. (1980). The foundation of information science. Part I. Philosophical aspects. Journal of Information Science 2, 125-133
In Belkin, Oddy, and Brooks (1982a) the subject of interest is the ability to represent and classify users’ ASKs (anomalous states of knowledge), i.e., identifying, representing, and classifying that which is not known, that which makes a user initiate information-seeking behavior. In a sense, representations and classifications of users’ ASKs ought, directly or indirectly, to lead to improvements in information seeking and information use. Apart from the apparent improvements to information retrieval (IR) systems, models, and techniques, classification of information needs (as represented through the problem statement which helps derive the ASK representation) could potentially help individual users, and especially scientists in the scholarly community, to quickly identify and classify their information needs in a way that improves their information-seeking activities around research issues demanding a multitude of information types and sources. In this paper I would like to elaborate the implications of the ASK identification and classification concepts and their applicability and benefit in the process of information seeking and use by members of the scholarly community working to identify a research problem. This seems to fall within the scope of ‘user studies’ as explained by Wilson (1994): “I take the starting point of ‘user studies’ to be the individual information user who, in response to some perceived ‘need’, engages in information-seeking behavior” (p. 16). Here I am not concerned with the technological system level, but rather with the social structures that manifest themselves as sources of information aiding a scientist in finding pertinent information. Examples of such social structures acting as information sources from which a researcher can benefit would be journals, books, libraries, invisible colleges, colleagues at schools, professors, specialized discussion lists, conferences, colloquiums, etc.
Considering this week’s topic on the historical overview of information science and librarianship, I immersed myself in the readings with the aim of clarifying for myself what it is that we actually study in information science, and, perhaps more importantly, looking for a succinct definition with which I could quickly and easily explain to friends and family the aim, scope, and subject of my field of study. Further, given my traditional education as an electrical engineer (bachelor’s) and more specifically a telecommunications engineer (master’s), I was looking for discourses attempting to decompose the concept, or the thing, we easily refer to as ‘information’ into its more elementary parts. This assumes that information might be decomposable, that its nature can be examined, and that such decomposition can help to systematically treat the problems claimed to be in the realm of information science and librarianship.
The book chapter by Vakkari (1994) presents a comprehensive, informative, and critical overview of the historical directions of information studies and librarianship, their interdisciplinary nature, and the past and present problems treated by the field. Vakkari’s critical framework of analysis attempts to locate information science and librarianship within the domains of already established sciences or fields of study; thus he borrows concepts and tools for analysis from the sociology and philosophy of science (p. 3). The interdisciplinary nature of information science (for Vakkari, librarianship is one with information science) clearly surfaces from his analysis.
“Information is not just one thing. It means different things to those who expound its characteristics, properties, elements, techniques, functions, dimensions, and connections.” (Machlup and Mansfield, p. 4)
Machlup, F. & Mansfield, U. (1983). Cultural Diversity in Studies of Information. In F. Machlup and U. Mansfield (Eds.), The Study of Information. Wiley, 3-59
Newby's second area of emphasis for the long-term goal of exosomatic memory, "to enable personalized relations to representations of data sets (as opposed to 'one size fits all')" (p. 1028), should, I believe, play a more important role in designing information systems. My suggestion is that in order to enable personalized relations to representations of data sets, it is not necessary to have similarity and consistency between the information space and the cognitive space. Rather, information spaces should be built with generality of use in mind, stressing efficiency and performance, perhaps matching a generic human cognitive space. The emphasis should be placed on designing interfaces that present the generic information space as unique and as conceptually similar and consistent with the user’s cognitive space. The representation interfaces would be designed with a particular user's cognitive space in mind and with the ability to interface with multiple information spaces.
The advantage could be twofold: a) generic information systems and their corresponding information spaces will have wider use and utility, since they are not designed for a particular cognitive space; b) human users will be able to tap into a multitude of information spaces with ease, without the need for each system to be designed with their cognitive space in mind. What I have suggested could lead to a decoupling of the representation interface from the computerized representation of data sets in information systems.
Once this decoupling occurs, ideally, various specialized information spaces/systems can be used in distributed cognition processes. Considering that in distributed cognition "the central unit of analysis is the functional system, which essentially is a collection of individuals and artefacts and their relations to each other in a particular work practice" (Rogers, p. 122) and that "… the focus is on the way in which knowledge is transmitted between team members and on how information is propagated through and across the artefacts" (p. 122), the decoupling can aid in designing and building artefacts open to various representation interfaces and capable of tapping into a multitude of information spaces.
Such congruence between artefacts with a more generic information space (possibly domain-specific) and representation interfaces capable of tapping into a multitude of information spaces, while still representing information and knowledge in a form compatible with the cognitive space of the particular user, could aid in designing and building effective and efficient distributed cognition environments.
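The decoupling argued for above can be sketched as a minimal adapter pattern (a sketch under my own assumptions; every class and method name is hypothetical): one generic information space exposes a single query interface, and per-user representation interfaces reshape its output for a particular cognitive space.

```python
# Sketch of decoupling representation interfaces from a generic
# information space; all names are hypothetical.

class InformationSpace:
    """Generic store, built for efficiency, not for any one user."""
    def __init__(self, records):
        self._records = records  # list of (term, text) pairs

    def query(self, term):
        return [text for t, text in self._records if t == term]

class RepresentationInterface:
    """Base class: adapts generic results to a user's cognitive space."""
    def present(self, results):
        raise NotImplementedError

class VerboseInterface(RepresentationInterface):
    def present(self, results):
        return "\n".join(f"- {r}" for r in results)

class TerseInterface(RepresentationInterface):
    def present(self, results):
        return "; ".join(results)

# One generic space, two user-specific presentations of the same results:
space = InformationSpace([("memex", "Bush 1945"), ("memex", "Newby 2001")])
hits = space.query("memex")
print(VerboseInterface().present(hits))
print(TerseInterface().present(hits))
```

The design choice mirrors the twofold advantage above: the space stays generic and reusable, while each interface alone carries the user-specific adaptation, and one interface could equally be pointed at several spaces.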
Newby, G. B. (2001). Cognitive Space and Information Space. Journal of the American Society of Information Science and Technology, 52, 1026-1048.
Rogers, Y. & Ellis, J. (1994). Distributed Cognition: An Alternative Framework for Analyzing and Explaining Collaborative Working. Journal of Information Technology, 9, 119-128.
It is the confrontation between the book merchants, who saw the prohibition of heretic books as a threat to their business (Febvre & Martin, p. 304), and reformers on one side, and the religious, political, and social authorities and institutions on the other, that we see reflected in Mill (1921). Mill suggests that it is the confrontation of one’s opinions and ideas via open and free discussion, free from governmental oversight and censorship, that leads to advancement and progress in human society. In the case of the book, it is the ‘heretic’ book itself, the other opinion, which through a very long struggle brought about the freedom to express opinion, free from censorship. Febvre and Martin (1976) suggest that it is this exchange of diverse opinions via books and other printed material, rather tragic in many instances in various periods of human history, that helped fuel the ‘coming of the book’ as an artifact of daily life (p. 108).
Whatever the ways in which the book was used and by whom, and analogous to other technological advances in human history that have been used to benefit human society as well as to wreak havoc, an undeniable benefit will be permanently associated with the printed book: its ability to keep records of information and representations of human knowledge, making them available through space and time, thus acting at a distance as an artifact for social change. This is the book’s double role: as a statement/representation of social and individual knowledge, and as an actor or agency acting upon the same.
Febvre, L. & Martin, H.-J. (1976). The book as a force for change. In The Coming of the Book. N.L.B., 248-332
Mill, John Stuart. (1921). On Liberty. Atlantic Monthly Press, 59-111
The concepts brought forth by actor-network theory are so pervasive in our daily lives that we utilize them without acknowledging the aforementioned scholarly frameworks.
For example, we constantly try to convince our friends to come and see a movie with us, or advise them to take a certain class we found beneficial. We do this without racking our brains about each and every detail of why we acted in a particular manner. Perhaps it is because of this pervasiveness in daily-life encounters that, when reading and learning about actor-network concepts, one does not necessarily find ‘new’ things, beyond the fact that these concepts have enabled us to engage in scholarly discourse, presenting and structuring our thoughts, ideas, and opinions in ways that make them easily inscribable (both in people's minds and as exosomatic memory artifacts) such that they act at a distance across time and space. In this endeavor we do not stand as isolated individuals; we are also performed upon.
Actor-network theory and its concepts have acted as inscription and translation tools in the process of writing various class papers, and seem likely to act at a distance for some time to come in my scholarly training. These statements regarding inscription and performation are circular in nature (we learn, but that learning affects how we learn in the future), perhaps letting us know that we are not isolated; we constantly perform and are performed upon.
In their presentation of historical accounts around and about the book right after the printing press became feasible for mass use, Febvre and Martin (1976) argue that business decisions about profitability played a crucial role in spreading the book and making it widely used, speaking in relative terms. A point not explicitly raised and elaborated in this particular chapter, however, is that profit-making ventures could not have been solely responsible for the dramatic change that took place in the wide acceptance of the book. A favorable interplay of social, political, and cultural factors was a necessary ingredient for the merchants of the book to be successful in their ventures. One could argue that this favorable atmosphere came about because of necessary historical forces in line with the concept of progressive human evolution, where merchants and profit-minded people seized the opportunity to enrich themselves by utilizing this new phenomenon. In this short paper, I argue that the merchants of the book, together with reformists like Luther and Calvin, played a crucial role in bringing the book to the masses. On one side, merchants saw profitability in the increased readership. On the other, Luther and Calvin envisioned the book (or any printed material, for that matter) as an agent for social and political change. The various kings, monarchs, noblemen, religious authorities, and religious institutions that had no interest in changing the social and political structures of their dominions jumped on the bandwagon a little late, after having realized what a powerful tool Luther and Calvin had at their disposal.
In the beginning phases of the ‘coming of the book’, when it slowly started to become an item in the daily lives of those who had access to it (those who could afford it and who could read), a functional analogy could be drawn with Ranganathan’s Second Law of Library Science, every reader his or her book (Ranganathan, p. 81). The merchants did not just print any books. They made a tremendous effort to print the books they thought would be in demand, so that they could profit. At that time, only religious books and the pamphlets used by the clergy were in high demand.
“Just as the individual mind deteriorates when it is deprived of knowledge or information, so also society disintegrates when there is not a constant flow of knowledge among its members, and throughout the parts that comprise its structure and organization” (Shera, p. 122)
Shera, J.H. (1970). Sociological Foundation of Librarianship. Washington, D.C.: ASIS Publishing, 52-110.
Spoken and written language is one of the many techniques for conveying information. As such, certain aspects of language can be considered to fall within the realm of information science research and practice. Yet the role of language in information science has not been explicated and explored in detail. Considering that language is an instrument of communication for conveying ideas, concepts, thoughts, and knowledge between two cognitive entities, its study in information science is very compartmentalized. Considering the task of information science to untangle the mysteries of accessing a bewildering amount of information, relatively little theoretical research has been done in information science on utilizing the instrument of language in the process of (re)presenting knowledge and thought in written textual artifacts and in the reverse process of interpreting the textual artifact for learning purposes. There seems to be a lack of integrated theoretical research regarding content and its possible meaning.
Nevertheless, there have been some breakthroughs in the utilization of language-related tools in information retrieval, for manual and automated indexing of textual information as well as for the representation and formulation of queries with the help of various language constructs. The activity of utilizing language tools is known as natural language processing (NLP), with term discovery, identification, and acquisition as the main tasks: “The two main activities involving terminology in NLP are term acquisition, the automatic discovery of new terms, and term recognition, the identification of known terms in text corpora” (Jacquemin & Bourigault, p. 6)
As far as information science is concerned, NLP techniques and tools are mostly applied in manual and automatic indexing and representation of information objects, as well as in the representation and formulation of requests. In addition, “indexes are useful for information seekers because they: support browsing, a basic mode of human information seeking; provide information seekers with a valid list of terms, instead of requiring users to invent the terms on their own….; are organized in a way that bring related information together…” (Wacholder et al., 2001, p. 116). Terms appear to be very significant tools for initial queries and the first few query modifications, as they provide a controlled environment that aims at assisting the user in the right quest.
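Term recognition in the sense quoted above, identifying known terms in text corpora, can be sketched as simple dictionary matching against a controlled vocabulary (a toy illustration; the vocabulary and document are invented, and real NLP systems also handle morphological and syntactic variation):

```python
import re

# A controlled vocabulary of known terms (illustrative only).
known_terms = {"information retrieval", "natural language processing", "indexing"}

def recognize_terms(text, terms):
    """Return the known terms that occur in the text (case-insensitive)."""
    lowered = text.lower()
    return sorted(t for t in terms
                  if re.search(r"\b" + re.escape(t) + r"\b", lowered))

doc = ("Natural language processing techniques support indexing "
       "and query formulation in information retrieval systems.")
print(recognize_terms(doc, known_terms))
# ['indexing', 'information retrieval', 'natural language processing']
```

Matching against a fixed term list is exactly what gives the "controlled environment" mentioned above: the user is offered valid index terms rather than having to invent them.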
In his critical analysis of the Harvard Business Review article 'IT Doesn't Matter' (by Nicholas Carr), David Kirkpatrick makes the following comment in "Stupid-Journal Alert: Why HBR's View of Tech Is Dangerous": "One of the article's most glaring flaws is its complete disregard for the centrality of software. Any human knowledge or information can be mediated and managed by software."
This comment is very simplistic and disregards the fact that human knowledge is more than just information that can be managed by software. If by information Kirkpatrick means the 'thing' that software is able to manipulate, then certainly human knowledge does not equate to information. As such, Carr might be justified in principle in stating that 'IT Doesn't Matter'. Whether we have reached the time when IT really does not matter is another story. But if information is different from human knowledge (and it appears to be so; otherwise we would use the word informational as synonymous with knowledgeable), soon the IT infrastructure will reach a point where actual knowledge will make the difference, and NOT the 'information as thing' which is what IT (and software) can manipulate.
To the argument that knowledge can be manipulated and managed by software, it should be added that knowledge is not something that resides in files or other entities that can be manipulated by software. If anything, the inscriptions and writings in files and documents that claim to present certain knowledge are (re)presentations only, approximations of the human knowledge that can reside only in the human mind. So what software can and does manipulate and manage is certainly not knowledge.
In defining the exosomatic memory concept, Newby quotes Brookes as saying that "An exosomatic memory system is a computerized system that operates as an extension to human memory. Ideally, use of an exosomatic memory system would be transparent, so that finding information would seem the same as remembering it to the human user" (Newby, p.1028).
Brookes's profound statement seems to have strongly influenced Newby into framing the analysis in his article in terms of similarity and consistency between the cognitive space and the information space: "In exosomatic memory systems, the information space of systems will be consistent with the cognitive space of their human users" (Newby, p. 1026). Such emphasis on similarity and consistency seems to come from the fact that Brookes talks of exosomatic memory systems as extensions of the human memory. In explaining the value and use of information systems, Newby suggests that "to improve the utility of the information systems, we would like to identify representation schemes for data sets that are consistent with human perception of those data sets" (p. 1030). Is it necessary for the information space to be similar to and/or consistent with the cognitive space?
Cramton’s article identifies and analyzes a multitude of problems constituting failures in the process of establishing and maintaining mutual knowledge (failure to communicate and retain contextual information, unevenly distributed information, difficulty communicating and understanding the salience of information, differences in speed of access to information, and difficulty interpreting the meaning of silence), as well as a few mechanisms for establishing and maintaining mutual knowledge (direct knowledge, interactional dynamics, and category membership). Both the problems constituting failures and the mechanisms for establishing mutual knowledge have helped me explain the behavior of members of project teams (dispersed and collocated) in which I have been involved in the past, and they appear to be good candidates for analyzing my involvement in future projects in the workplace.
The definition of mutual knowledge as “the knowledge that communicating parties share in common and know they share” (Cramton 2001, p. 346) is an appropriate assumption based on various cultural, anthropological and communication studies, as well as on our everyday experience that we exchange information with others while keeping in mind the contextual and situational background that helps us understand and interpret each other. Only with a common/shared understanding, in which interpretation and the meaning-making process are compatible, can we understand each other and actually communicate. In organizational settings, the failure to establish and maintain mutual knowledge has negative effects on a dispersed team’s decision quality, productivity and relationships (p. 349).
“The new institutionalism in organization theory tends to focus on a broad but finite slice of sociology’s institutional cornucopia: organizational structures and processes that are industrywide, national or international in scope” (Powell et al, p. 9)
“Institutionalized arrangements are reproduced because individuals often cannot even conceive of appropriate alternatives (or because they regard as unrealistic the alternatives they can imagine). Institutions do not just constrain options: they establish the very criteria by which people discover their preferences. In other words, some of the most important sunk costs are cognitive” (Powell et al, p. 11)
Starting from the premises of new institutionalism, with its scope, constraints and criteria establishment, Orlikowski and Barley (2001) proceed to argue that information technology (IT) research and organization studies (OS) have much more in common than has so far been acknowledged in the scholarly communication and practice of both fields.
IT research is mostly practical in nature, dealing with the design, deployment, and use of artifacts that represent tangible solutions to real-world problems (Orlikowski et al, p. 146), whereas OS is theoretical, developing and testing parsimonious explanations for broad classes of phenomena (p. 147). Given that "organization studies (OS) and information technology (IT) are disciplines dedicated respectively to studying the social and technical aspects of organization" (p. 146), they posit that the differences between IT research and OS are epistemological in nature rather than differences of subject matter: the two fields treat the issues of organization at different levels, emphasizing the particular and the general respectively. "There can be no general knowing that is not somehow grounded in particulars and no particular explanation without some general perspective. Particulars are important for theory building, and theory is important for making sense of the specific" (p. 147)
Whether one utilizes and appropriates Actor-Network Theory, paradoxically, not as a theory but as a methodological approach to ethnomethodology, or treats ANT as an actual theory in the classical philosophical sense of a parsimonious theory with the ability to predict (i.e., via cause-effect relationships) phenomena around us, two properties are common and fundamentally critical to whatever color, flavor or form ANT might have emerged from and evolved into: inscription and translation, with their ability to act at a distance.
The distinction between Actor-Network Theory and ANT is not only semantic in nature, since “ANT” is not just an acronym for Actor-Network Theory. In moving from Actor-Network Theory to ANT, the concepts, ideas and thoughts of the original inscription of Actor-Network Theory performed, and were performed upon, in the web of scholarly discourse, thus translating themselves into self-sustaining quasi-theories. Had Actor-Network Theory not been reduced to ANT, it perhaps could not have become as pervasive as it has; but that pervasiveness came only through being translated, transformed and performed. This distinction is evident from Law’s and Latour’s statements in Law and Hassard (1999). Expressing his wish to recall ANT back to its origins, Latour, one of the original authors who laid down the principles of what has become ANT, states: