Conference abstract: Large-scale web-based collaboration is key for making science sustainable in the long run
Fri, 25/06/2010 - 6:43am | by daniel
In the spirit of Another Conference I did not attend, I embed below my summary of tweets on OKCon 2010, as well as the PiratePad that pleasantly served for collaborative note taking, for which I had proposed a wiki-based attempt yesterday. While the former is unbeatably interactive (one of the best ways to attend a conference online if no audio or video is available), I think the latter is better suited to archiving and structuring the information about the conference, its sessions and its talks in the long term. Looking forward to another Etherpad-based attempt at OpenSciNY.
Later today, OKCon 2010 will take place — the fifth (or fourth, depending on whether WSFII 2005 counts or not) installment of the Open Knowledge Conference, organized annually by the Open Knowledge Foundation.
I have contributed to a paper (with @Tom Morris, who will present it) that is scheduled for the Community-Driven Research session and describes Citizendium as a platform for the collaborative structuring of knowledge by experts and the public. I cannot attend in person but will follow the conference online via Twitter and FriendFeed, and this blend of wiki and microblogging on the same topic prompted me to give collaborative blogging another try, this time via the wiki entry on the conference, embedded below. Caveat: only registered users can edit, but everyone can register, and approval rarely takes more than a few hours. If that is too late for you to keep your OKCon 2010 notes there, the wiki can still serve to structure and contextualize them later. Or it can simply link to your blog posts, images and other materials on the matter.
Anyway, here we go for the wiki variant:
I couldn't attend CPOV 2010 in person but followed it via Twitter and took a number of screenshots, which I combined into this animation (3:10 min in total, at 0.5 frames per second). I also attached it as an animated GIF at 1 frame per second, but that version may mainly illustrate information overload.
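For anyone curious how such a screenshot animation can be assembled, here is a minimal sketch using the Pillow imaging library. This is not necessarily the tool used for the animation above; the frame contents and file name are placeholders (synthetic solid-color frames stand in for the actual screenshots):

```python
from PIL import Image
import tempfile, os

# Synthetic stand-ins for the conference screenshots.
frames = [Image.new("RGB", (320, 240), color) for color in ("red", "green", "blue")]

out_path = os.path.join(tempfile.gettempdir(), "cpov-tweets.gif")
# duration is milliseconds per frame: 2000 ms corresponds to 0.5 frames per second.
frames[0].save(out_path, save_all=True, append_images=frames[1:],
               duration=2000, loop=0)

with Image.open(out_path) as gif:
    print(gif.n_frames)  # 3
```

Switching to 1 frame per second is just a matter of setting duration=1000.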
Furthermore, as a follow-up to our previous conversations, Janet Haven from the Open Society Institute's Information Initiative sent me some supplementary questions about their strategy (in which open science may or may not play a role, though it is now short-listed as a potential major strategic element). I will briefly reflect on them here before passing the ball on to you.
Finally, a major scientific society asked me for input on the likely advantages and drawbacks of allowing, by default, all content of the scientific sessions of their conferences to be broadcast live in any medium, and on whether it would be sensible to make this a standard requirement whenever they sign the contract with the organizers of an upcoming conference.
I find the last item a bit daunting for tonight, so I will just link to a blog post on a related discussion (how to signal which ways of broadcasting a conference are OK) and invite your comments, so that I can hopefully send them a useful reply within a few days.
Back in November was the abstract submission deadline for the 2010 Conference of the International Society for Ecological Economics (ISEE). I had submitted a contribution entitled "What if science were sustainable?", promising to keep track of all further developments under the "ISEE-2010-sustainable-science" tag.
So here we go: the notification of acceptance just came in, containing these details on the review procedure:
The international response to the call for papers was overwhelming. We received about 1300 abstracts from 1100 registered submitters in 89 countries, with a generally very high quality. All abstracts have been evaluated and graded independently and anonymously by at least two members of our international review committee consisting of 96 reviewers. Abstracts have been allotted to reviewers on a random basis within the respective thematic foci. We will list all names of our review panel on our website. Based on the grades that we received for each abstract from our reviewers, we calculated an average grade for every abstract, and then ranked all abstracts accordingly. In cases where the span between two review results was significant a third review was collected. Double submissions were rejected. Most reviewers added comments to their reviews that can be accessed through the ConfTool system at https://www.conftool.com/
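The averaging, discrepancy-flagging and ranking procedure the organizers describe can be sketched in a few lines. All names below are my own illustration (this is not ConfTool's code), and the span threshold that triggers a third review is an assumed parameter:

```python
from statistics import mean

def rank_abstracts(grades, span_threshold=2):
    """Average each abstract's grades, flag large reviewer disagreement
    for a third review, and rank abstracts by mean grade (higher is better)."""
    needs_third_review = [
        aid for aid, gs in grades.items()
        if max(gs) - min(gs) >= span_threshold
    ]
    ranking = sorted(grades, key=lambda aid: mean(grades[aid]), reverse=True)
    return ranking, needs_third_review

# Toy example: three abstracts, each graded by two reviewers on a 1-5 scale.
grades = {"A1": [4, 5], "A2": [2, 5], "A3": [3, 3]}
ranking, flagged = rank_abstracts(grades)
print(ranking)   # ['A1', 'A2', 'A3']
print(flagged)   # ['A2'] -- span of 3 between its two grades
```

With roughly 1300 abstracts and 96 reviewers, random assignment within thematic foci keeps each reviewer's load at around 27 abstracts for two reviews per abstract.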
Via that ConfTool system, I could indeed find the reviewers' reports, which I have pasted below (with thanks to the reviewers), in the spirit of promoting public peer review (a screenshot with the nicer original layout is attached):
Three avenues to support open approaches to science - the cases of funding, data acquisition and knowledge curation
Tue, 23/02/2010 - 2:06am | by daniel
We'd like to ask you to think about two to three emerging opportunities for, or threats to, open society institutions and values that you are aware of which are not receiving sufficient attention and where a funder like OSI could usefully intervene. We encourage you to suggest issues that are still very much on the horizon; there need not be an obvious solution to the points you raise.
I know that the OSI had and has many interesting projects running (including in regions and cultures normally off the radar, some of them dear to me), but I have often taken its abbreviation, not just jokingly, to stand for "Open Science Institute". So I take the liberty here of shrinking the space of possible replies by concentrating on openness in science, which is in any case the most prominent topic on my blog.
My intuitive response is that several inefficiencies in our current knowledge creation and curation systems cry out for a test run of open approaches. I am not sure whether I can distill this down to three issues, but let me start by listing some of the ideas, and I hope that you can then help me structure and adapt them appropriately. To facilitate the discussion, I will resort to Cameron's depiction of the research cycle:
Following up on last night's demo of a paper turned into a wiki article, I am adding below a pictorial summary of some of the key issues. The comments are meant to apply to a typical paper, not necessarily just this one or other papers in this journal.
While 140-character summaries of scientific papers seem to be today's topic in some parts of my feedsphere (#sci140), I want to return to another way of making publications shorter and more efficient, which has been discussed before in various contexts, e.g. under the label of micropublication.
Science is already a wiki if you look at it a certain way. It's just a highly inefficient one: the incremental edits are made in papers instead of wikispace, and significant effort is expended recapitulating existing knowledge in each paper in order to support the one to three new assertions it makes.
In this spirit, I have taken one of my articles whose license permits reuse and modification and turned its abstract and introduction into a demo of what publishing in a wiki-style environment might look like.
The following is a reply to "On Citizendium", whose comment form would not accept this comment when I pasted it in from my text editor.
Thanks for the constructive feedback. Several points I wish to add:
- Real names are necessary at some point, since they provide a simple and time-honoured way to deal with the situation that "What hasn't kept pace with the technical innovation is the recognition that people need to engage in civil dialogue."
- The only articles about whose quality Citizendium makes any claim are Approved Articles. Currently, there are 121 of these. Yes, this is a very small number, largely due to (1) the small number of active contributors and (2) the complicated approval system, whose streamlining has long been on the agenda but has not proceeded much, again because of (1), though we have actually discussed combining FlaggedRevisions with expertise as a possible solution. For all non-approved articles, no statement about quality is made, but the real-name requirement keeps vandalism fairly well at bay.
- Real names and Approved Articles are just some of the differentiators. Others include the use of subpages to structure information pertaining to an article's topic (e.g. Related Articles, which essentially replace categories for navigation).
- Larry has announced repeatedly that he will step down as Editor-in-Chief, and a Citizendium Charter is currently being drafted that will govern how the project develops after this transition. In its current version, it covers aspects like dispute resolution, partnering with external organizations, and integration with teaching and research (activities by sizable communities for which reliability is essential). Comments are very welcome.
This comment was originally posted as http://gfulibrarian.wordpress.com/2009/12/08/using-wikipedia/#comment-10 but seems to have been flagged as spam, probably because it contained more than two links. So I repost it here:
Nice summary, but I would like to add that both models can and do evolve. For instance, the "does not change" aspect does not hold for journals like PLoS ONE (where articles can be annotated by any registered user of the site) or Scholarpedia (a scholarly review journal published on a wiki platform, and hence updatable).
The article Block cipher at the Citizendium has recently been approved, and since I found it a pleasant read and it is licensed CC-BY-SA, I paste it in below. Unfortunately, some of the formatting (particularly the references and equations) did not survive the transplant, so if anyone knows of a tool for publishing from a wiki to Drupal, I would appreciate a hint.