December 2003 Archives

open source politics?


From Clark Campaign Going Open Source:

"Clark's technology team announced Monday the launch of Clark TechCorps, an initiative to build a suite of free, open-source applications for campaigns and elections."

"The project will organize volunteers to write software for the Clark campaign and release their work under open-source licenses."

""Open source for us symbolizes organizational transparency. We really feel that it's important that all development we do has this methodology behind it," said Clark TechCorps project manager Josh Hendler."

Caution over 'computerised world'


Caution over 'computerised world'

"The team in Switzerland looked at the health, social and environmental implications of what is called pervasive computing."
...
"The idea behind pervasive computing is that everything around us contains some sort of electronic device."
...
"I am not saying I am against technology," he insisted, "but we should be aware there is a price to pay."

Indeed. Technology is here to stay, intermingled with humans and other social structures. We need to be cautious and careful not to implement technologies that are restrictive and controlling; instead, we should push toward technologies of openness.

Free software to aid poor doctors


From Free software to aid poor doctors:

"A group of open source evangelists are looking to take the program called Vista beyond the borders of the US.
...
They say hospitals could save money by using the free software, as well as potentially saving patients' lives."

“The conditions associated with a particular class of conditions of existence produce habitus, systems of durable, transposable dispositions, structured structures predisposed to function as structuring structures, that is, as principles which generate and organize practices and representations that can be objectively adapted to their outcomes without presupposing a conscious aiming at ends or an express mastery of the operations necessary in order to attain them. Objectively ‘regulated’ and ‘regular’ without being in any way the product of obedience to rules, they can be collectively orchestrated without being the product of the organizing actions of a conductor” (Bourdieu, p. 53)

The above quote by Bourdieu, when society itself is viewed as the ‘habitus’, is quite informative (in theory as well as in practice) about media’s interplay with the social structures within which they are embedded. As we have seen throughout our course readings, media technologies, as important instruments at various levels of society’s communication processes, have encountered resistance from various cultural and social norms, and a mixed response from economic and political forces because of their profit-making potential and power-generating ability. More than any other type of technology, media and communication technologies have been the subject of public and scholarly debate because of their intrinsic capacity to convey content asynchronously across time and space, inscribed in the form of data, information, images, knowledge, and wisdom, in mediums such as books, data tape drives, CD-ROMs, and video and audio tapes. Additionally, synchronous media (e.g. the telephone, audio and video conferencing, online chat) have enabled instantaneous communication among people, allowing efficient, but not necessarily effective, exchange of information, ideas, thoughts, and concepts.

The pervasive and widespread use of media technologies, often for symbolic purposes, also serves the governing elites in maintaining the status quo and ensuring stability. The necessity to reproduce and maintain a stable state, the habitus (to borrow from Bourdieu, whose habitus concept resembles the stable state produced and maintained by the hegemonic ideology), requires ways of disseminating the cultural and political material of the dominant ideology. Similarly to how Bourdieu describes the functioning of the habitus, Gitlin defines the status quo as hegemony, “a ruling class’s (or alliance’s) domination of subordinate classes and groups through the elaboration and penetration of ideology (ideas and assumptions) into their common sense and everyday practice,” and contends that it “is systematic (but not necessary or even usually deliberate) engineering of mass consent to established order” (Gitlin, 1980, p. 253). Further elaborating on hegemony and clarifying the composition of the elite (mostly government, the corporate establishment, and the institutions that produce cultural artifacts), Schiller (1996) explains their economic reason for cooperation: “The American economy is now hostage to a relatively small number of giant private companies, with interlocking connections, that set the national agenda. This power is particularly characteristic of the communication and information sector where the national cultural-media agenda is provided by a very small (and declining) number of integrated private combines. This development has deeply eroded free individual expression, a vital element of a democratic society” (Schiller, 1996, p. 44).

This paper will attempt to elaborate on the interplay between media and communication technologies and the social structures and forces (social, cultural, economic, political), whether institutionalized or not, emphasizing that both the content and the channels of communication through which it is distributed are important factors in the production, maintenance and further reproduction of the artifacts of the dominant ideology. I will argue that the content being represented and recorded, when conveyed via open communication (such as the Internet), can show us the liberating potential of various media technologies. As such, communication technologies are situated as important actors in the process of displacing or shifting the status quo.

open content, open communication everywhere!


From Copyright Doesn't Cover This Site:

"To prove that open sourcing any and all information can help students swim instead of sink, the University of Maine's Still Water new media lab has produced the Pool, a collaborative online environment for creating and sharing images, music, videos, programming code and texts. "
...
"We are training revolutionaries -- not by indoctrinating them with dogma but by exposing them to a process in which sharing culture rather than hoarding it is the norm," said Joline Blais, a professor of new media at the University of Maine and Still Water co-director.
...
"It's all about imagining a society where sharing is productive rather than destructive, where cooperation becomes more powerful than competition," Blais said.

What is Logistic Regression?
“Logistic regression allows one to predict a discrete outcome such as group membership from a set of variables that may be continuous, discrete, dichotomous, or a mix.” (Tabachnick and Fidell, 1996, p. 575)

What is Discriminant Analysis?
“The goal of the discriminant function analysis is to predict group membership from a set of predictors” (Tabachnick and Fidell, 1996, p. 507)

When/How to use Logistic Regression and Discriminant Analysis?
From the above definitions, it appears that the same research questions can be answered by both methods. Logistic regression may be better suited for cases where the dependent variable is dichotomous, such as Yes/No, Pass/Fail, Healthy/Ill, Life/Death, etc., while the independent variables can be nominal, ordinal, interval or ratio. Discriminant analysis might be better suited when the dependent variable has more than two groups/categories. However, the real decision about which one to use depends on the assumptions regarding the distribution of, and relationships among, the independent variables, and the distribution of the dependent variable.

So, what is the difference?
Well, for both methods the categories of the outcome (i.e. the dependent variable) must be mutually exclusive. One way to decide between logistic regression and discriminant analysis when there are more than two groups in the dependent variable is to examine the assumptions pertinent to each method. Logistic regression is much more relaxed and flexible in its assumptions than discriminant analysis. Unlike discriminant analysis, logistic regression does not require the independent variables to be normally distributed, linearly related, or of equal variance within each group (Tabachnick and Fidell, 1996, p. 575). Being free of the assumptions of discriminant analysis makes logistic regression usable in many situations. However, “when [the] assumptions regarding the distribution of predictors are met, discriminant function analysis may be more powerful and efficient analytic strategy” (Tabachnick and Fidell, 1996, p. 579).

Even though logistic regression does not have many assumptions, and is thus usable in more instances, it does require a larger sample size: at least 50 cases per independent variable might be required for accurate hypothesis testing, especially when the dependent variable has many groups (Grimm and Yarnold, 1995, p. 221). However, given the same sample size, if the assumptions of multivariate normality of the independent variables within each group of the dependent variable are met, and each category has the same variance and covariance for the predictors, discriminant analysis might provide more accurate classification and hypothesis testing (Grimm and Yarnold, 1995, p. 241). The rule of thumb, though, is to use logistic regression when the dependent variable is dichotomous and there are enough samples.
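To make the comparison concrete, here is a minimal sketch in Python using scikit-learn (the library and the simulated data are my assumptions, not anything from the texts cited here) that fits both methods to the same dichotomous outcome:

```python
# Compare logistic regression and linear discriminant analysis on the
# same simulated two-group outcome (e.g. Pass/Fail) with four predictors.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=4, n_informative=3,
                           n_redundant=0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

logit = LogisticRegression().fit(X_train, y_train)  # few assumptions
lda = LinearDiscriminantAnalysis().fit(X_train, y_train)  # assumes normality

print("logistic regression accuracy:", logit.score(X_test, y_test))
print("discriminant analysis accuracy:", lda.score(X_test, y_test))
```

On data that satisfy the multivariate normality and equal covariance assumptions the two classifiers often score similarly; the differences tend to appear when those assumptions are violated or the sample is small.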

References:
Grimm, L.G., & Yarnold, P.R. (Eds.). (1995). Reading and Understanding Multivariate Statistics. Washington, D.C.: American Psychological Association.

Tabachnick, B.G., & Fidell, L.S. (1996). Using Multivariate Statistics. New York: HarperCollins.

Scientific Research Backs Wisdom of Open Source


From Scientific Research Backs Wisdom of Open Source:

Few quotes:

"There's something going on in open-source development that is different from what we see in the textbooks," says Walt Scacchi, a senior research scientist at UC Irvine's Institute for Software Research.
...
"Open-source is not a poor version of software engineering, but a private-collective approach to large-software systems," Scacchi said.

Networking on the Network: A Guide to Professional Skills for PhD Students is an invaluable resource by Phil Agre for Ph.D. students and others who would like to find their way around the modern-day maze we call the network (social and cyber).

Quote:
"The first thing to realize is that Internet-world is part of reality. The people you correspond with on the network are real people with lives and careers and habits and feelings of their own. Things you say on the net can make you friends or enemies, famous or notorious, included or ostracized. You need to take the electronic part of your life seriously. In particular, you need to think about and consciously choose how you wish to use the network. Regard electronic mail as part of a larger ecology of communication media and genres -- telephone conversations, archival journals and newsletters, professional meetings, paper mail, voice mail, chatting in the hallway, lectures and colloquia, job interviews, visits to other research sites, and so forth -- each with its own attributes and strengths. The relationships among media will probably change and new genres will probably emerge as the technologies evolve, but make sure that you don't harbor the all-too-common fantasy that someday we will live our lives entirely through electronic channels. It's not true."

Rice Virtual Lab in Statistics


Rice Virtual Lab in Statistics

"An online statistics book with links to other statistics resources on the web."

making sense of information


In Too much information, Nathan Cochrane makes a good point that, despite the multitude of tools at our disposal for managing and manipulating information, we have not necessarily become more informed decision makers. Perhaps the events and issues about which we need to make informed decisions have become so complex that the current tools (based on utilitarian theoretical foundations) do not help us much.

"SCO letter is rubbish"


Are they (the SCO folks) out of their minds? When did it become a violation (of any sort) to share your knowledge, expertise, and any other product deriving from them for free?

From Open-Source Legal Experts Dismiss SCO's Copyright Claims:

"The second opinion is where the rubbish lies," Carey said. "While the U.S. Constitution grants Congress broad powers to protect authors and inventors, it does not grant Congress the power to prevent authors and inventors from giving their work away (or from licensing it for free on the condition that derivative works also be licensed for free). Nor has Congress ever attempted to prevent authors and inventors from giving their work away, or licensing them for free. It is not illegal, immoral or unconstitutional to be generous with IP. Heaven help us if such an intellectual-property regime ever comes to pass."

I couldn't agree more!

TV, Violence and Aggression


In determining which of the four readings to analyze more closely for this exercise, the Robinson, Wilde, Navracruz, Haydel and Varady (2001) article presents, in my view, the most coherent research piece, primarily because its theoretical background is easier to understand (in comparison with the rest of the articles), considering that I am not well versed in the behavioral and cognitive sciences that seem necessary to fully understand, appreciate, and constructively criticize the others. Further, Robinson et al. have gone to great lengths to elaborate in detail on their methodology, the measures used, and their rationale for using them, including a rather detailed report of the statistical procedures used and the corresponding results. The article ends with a relatively substantial set of concluding remarks, including elaborations of its limitations and strengths.

Unlike the other three articles, which attempt to understand what happens to treatment group(s) exposed to an intervention that increases the dose of exposure to aggressive and violent media, or to media in general, the Robinson et al. article asks whether a reduction in media exposure (reduced television, videotape and videogame use) has the effect of reducing violent and aggressive behaviors.

The basic premise in Robinson et al., as shown by the rest of the articles, is that exposure to media increases violent and aggressive behavior (Centerwall, 1989); in particular, exposure to violent and aggressive television and videotape viewing leads subjects to exhibit less sensitivity and concern about such behaviors when committed by others (Linz, Donnerstein, & Penrod, 1984). Thus, Robinson et al. hypothesize that reducing media exposure in general, by reducing television, videotape and videogame use, reduces violent and aggressive behaviors in children.

Intel releases Open Source Lib - OpenML


From Intel releases Open Source Lib - OpenML:

"VANCOUVER, British Columbia, Dec. 8, 2003 -(LinuxElectrons)- Intel Corporation researchers have released software that allows developers to build computers that can "learn" from their experience, using data to proactively improve their own accuracy and the ease with which we use them. The announcement was made today at the opening of the Neural Information Processing Systems Conference (NIPS2003)."

"The software enables computers to estimate the likelihood that something will happen by calculating how often it occurred in the past. The software can be used to enhance a wide variety of interactive and industrial computer applications -- everything from culling through huge databases of gene studies to spot promising proteins for new drugs to email systems that create a model of a person's behavior to decide how best to manage newly arriving messages on its own. The software is available through Intel's Open Source Machine Learning Library (OpenML), a toolbox of functions that helps researchers develop machine learning applications."

An interesting development indeed! And it is open source.
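The frequency-counting idea in the Intel quote can be illustrated with a toy sketch. This is only my illustration of the general principle, not Intel's OpenML API; `estimate_probability` is a hypothetical helper and the mail-folder data is made up:

```python
# Estimate how likely an event is from how often it occurred in the past,
# with Laplace smoothing so unseen events never get probability zero.
from collections import Counter

def estimate_probability(history, event, smoothing=1.0):
    """Smoothed relative frequency of `event` among past observations."""
    counts = Counter(history)
    vocab = len(counts) or 1  # number of distinct events seen so far
    return (counts[event] + smoothing) / (len(history) + smoothing * vocab)

# E.g. a mail system tracking which folder a user files messages into.
past = ["inbox", "inbox", "archive", "inbox", "spam"]
print(estimate_probability(past, "inbox"))  # (3+1)/(5+3) = 0.5
```

Real machine-learning libraries build far richer models on top of this (conditional probabilities, Bayesian networks), but counting past occurrences is the core of the "learning from experience" described in the press release.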

From World meet to end digital divide starts in Geneva Saturday:

"Leaders from nearly 200 countries including 60 heads of state and government will attend the first World Summit on the Information Society (WSIS) in Geneva Saturday aimed at bridging the digital divide between the rich and poor."

"The aim of the United Nations summit is to come up with a global plan to ensure everyone's access to information and communications technologies."

Hopefully the attendees at the summit will not forget that ensuring access to information and communication technologies for everyone does NOT necessarily mean a reduction in the digital divide between rich and poor nations, countries, and peoples.

If history is any indication, we should have learned by now that technology alone does not solve social problems. For example, it would be beneficial to hear how information technology helps developing countries escape poverty. It might, if the means of production in the developing countries are improved to build a self-sustainable economy based on access to information and information technology in general.

However, considering the conditions around the world at this stage, I would expect that activities related to building sustainable local economies (whether or not they are related to information technology) are more important in escaping poverty. People in the developing countries can have access to all the information technology they want (even this is questionable, because succeeding with information technology requires first creating the economic conditions needed to bring access to information technology to the majority of the people) and still might not be able to escape poverty unless some sort of sustainable local economy is established.

From Faster, Better, Cheaper: Open-source Practices May Help Improve Software Engineering:

"ARLINGTON, Va. -- Walt Scacchi of the University of California, Irvine, and his colleagues are conducting formal studies of the informal world of open-source software development, in which a distributed community of developers produces software source code that is freely available to share, study, modify and redistribute. They're finding that, in many ways, open-source development can be faster, better and cheaper than the "textbook" software engineering often used in corporate settings."

The most relevant aspect of my engineering courses (my background) is the emphasis on the systems mode of thinking, which has helped me tremendously in my present course of study here at SCILS, especially in Information Science.

So far, the challenge has been to build a frame of reference, or a mindset, through which one is able to see the problems related to information science and the resolutions proposed to resolve them. Personally, I believe that the systems way of thinking is a very insightful and powerful tool, especially because it helps you study a problem by identifying the boundaries around it, its scope, what happens within the boundaries, and how the problem at hand interfaces with the environment (i.e. with what lies outside the relevantly defined boundary).

Another challenge for me was adjusting to the statistical methods used in social research. Despite the obvious difference between the statistical analysis of technical systems and the analysis of relations between independent and dependent variables in social phenomena, the statistics background from my engineering courses has helped me identify the connections between the statistical analysis of engineering data and of data gathered from information science experiments. Another benefit of engineering statistics courses is that they provide a better understanding of the fundamental background of particular statistical tools, given that courses dealing with statistics for social research emphasize mostly the usability and applicability of statistics and do not necessarily stress the actual derivation of the statistical tools and procedures.

The concepts of interconnectivity among the various technical elements within information and communication systems, and the multitude of services they carry, relate almost directly (albeit at a different level of application) to the various practical communication tools and services that affect the social realm. An information and communication system is not an end in itself; it is produced and used within a social web of interactions composed of human and non-human entities, or networked actors, as suggested by actor-network theory (ANT) and its methodology. Considering that actor-network theory includes both human and non-human entities in its analysis and methodology, it would be interesting to identify and describe a possible link between variations and changes at the lowest levels of interaction (i.e. technological) and their potential effect on the interaction between a system as a whole and its user(s).

Through these few reflections, I have attempted to link the experience and knowledge obtained from my engineering education and systems analyst/engineering experience with the role they have played so far in my PhD-level classes in Information Science. I hope to have more of these sorts of reflections in the future, as they pop up in my head. :)

Who Owns The Facts?


Who Owns The Facts?

(courtesy of slashdot)
Quote:
"windowpain writes "With all of the furor over the Patriot Act a truly scary bill that expands the rights of corporations at the expense of individuals was quietly introduced into congress in October. In Feist v. Rural Tel. Serv. Co. the Supreme Court ruled that a mere collection of facts can't be copyrighted. But H.R. 3261, the Database and Collections of Information Misappropriation Act neatly sidesteps the copyright question and allows treble damages to be levied against anyone who uses information that's in a database that a corporation asserts it owns. This is an issue that crosses the political spectrum. Left-leaning organizations like the American Library Association oppose the bill and so do arch-conservatives like Phyllis Schlafly, who wrote an impassioned column exposing the bill for what it is the week after it was introduced."

By Mentor Cana, PhD
more info at LinkedIn
email: mcana {[at]} kmentor {[dot]} com

About this Archive

This page is an archive of entries from December 2003 listed from newest to oldest.
