I'm David Rosenthal, and this is a place to discuss the work I'm doing in Digital Preservation.
Monday, June 20, 2016
Glyn Moody on Open Access
At Ars Technica, Glyn Moody writes Open access: All human knowledge is there—so why can’t everybody access it?, a long (9 "page") piece examining this question:
What's stopping us? That's the central question that the "open access" movement has been asking, and trying to answer, for the last two decades. Although tremendous progress has been made, with more knowledge freely available now than ever before, there are signs that open access is at a critical point in its development, which could determine whether it will ever succeed.
It is a really impressive, accurate, detailed and well-linked history of how we got into the mess we're in, and a must-read despite the length. Below the fold, a couple of comments.
Friday, June 17, 2016
The Vienna Principles
This is an excellent piece of work, well worth reading and thinking about:
Between April 2015 and June 2016, members of the Open Access Network Austria (OANA) working group “Open Access and Scholarly Communication” met in Vienna to discuss [Open Science]. The main outcome of our considerations is a set of twelve principles that represent the cornerstones of the future scholarly communication system. They are designed to provide a coherent frame of reference for the debate on how to improve the current system. With this document, we are hoping to inspire a widespread discussion towards a shared vision for scholarly communication in the 21st century.
Their twelve principles are:
1. Accessibility
2. Discoverability
3. Reusability
4. Reproducibility
5. Transparency
6. Understandability
7. Collaboration
8. Quality Assurance
9. Evaluation
10. Validated Progress
11. Innovation
12. Public Good
Thursday, June 16, 2016
Bruce Schneier on the IoT
John Leyden at The Register reports that Government regulation will clip coders' wings, says Bruce Schneier. He spoke at Infosec 2016:
Government regulation of the Internet of Things will become inevitable as connected kit in arenas as varied as healthcare and power distribution becomes more commonplace, ... “Governments are going to get involved regardless because the risks are too great. When people start dying and property starts getting destroyed, governments are going to have to do something,” ... The trouble is we don’t yet have a good regulatory structure that might be applied to the IoT. Policy makers don’t understand technology and technologists don’t understand policy. ... “Integrity and availability are worse than confidentiality threats, especially for connected cars. Ransomware in the CPUs of cars is gonna happen in two to three years,” ... technologists and developers ought to design IoT components so they worked even when they were offline and failed in a safe mode.
Not to mention the problem that the DMCA places researchers who find vulnerabilities in the IoT at risk of legal sanctions, despite the recent rule change. So much for the beneficial effects of government regulation.
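Schneier's last point, that devices should keep working offline and fail in a safe mode, is easy to state and rarely implemented. Here is a minimal sketch of what it means in practice; the endpoint URL, configuration keys, and values are hypothetical, invented purely for illustration, not taken from any real product:

```python
# Sketch of an offline-tolerant, fail-safe IoT controller (hypothetical).
# The device prefers fresh cloud configuration, falls back to its last
# known-good cache when the network is down, and as a last resort uses a
# hard-coded conservative default instead of failing open or bricking.

import json
import urllib.request

CLOUD_URL = "https://example.com/thermostat/config"  # invented endpoint
SAFE_DEFAULT = {"heat_on_below_c": 5.0}              # frost protection only


def fetch_remote_config(timeout=2.0):
    """Try the cloud; return None rather than raising when offline."""
    try:
        with urllib.request.urlopen(CLOUD_URL, timeout=timeout) as resp:
            return json.load(resp)
    except (OSError, ValueError):  # network errors, timeouts, bad JSON
        return None


def effective_config(cached=None):
    """Cloud config if reachable, else the cache, else the safe default."""
    remote = fetch_remote_config()
    if remote is not None:
        return remote
    if cached is not None:
        return cached
    return SAFE_DEFAULT
```

The point is the precedence order: local operation is the baseline and the network is an enhancement, so the device keeps functioning even if the vendor's servers, or the owner's network, go away.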
This post will take over from Gadarene swine as a place to collect the horrors of the IoT. Below the fold, a list of some of the IoT lowlights in the 17 weeks since then.
Wednesday, June 15, 2016
What took so long?
More than ten months ago I wrote Be Careful What You Wish For, which, among other topics, discussed the deal between Elsevier and the University of Florida:
And those public-spirited authors who take the trouble to deposit their work in their institution's repository are likely to find that it has been outsourced to, wait for it, Elsevier! The ... University of Florida, is spearheading this surrender to the big publishers.
Only now is the library community starting to notice that this deal is part of a consistent strategy by Elsevier and other major publishers to ensure that they, and only they, control the accessible copies of academic publications. Writing on this recently we have:
- Ellen Finnie and Greg Eow from the MIT Library.
- The Coalition of Open Access Policy Institutions steering committee.
- And Barbara Fister.
librarians need to move quickly to collectively fund and/or build serious alternatives to corporate openwashing. It will take our time and money. It will require taking risks. It means educating ourselves about solutions while figuring out how to put our values into practice. It will mean making tradeoffs such as giving up immediate access for a few who might complain loudly about it in order to put real money and time into long-term solutions that may not work the first time around. It means treating equitable access to knowledge as our primary job, not as a frill to be worked on when we aren’t too busy with our “real” work of negotiating licenses, fixing broken link resolvers, and training students in the use of systems that will be unavailable to them once they graduate.
Amen to all that, even if it is 10 months late. If librarians want to stop being Elsevier's minions they need to pay close, timely attention to what Elsevier is doing. Such as buying SSRN. How much would arXiv.org cost them?
Tuesday, June 14, 2016
Decentralized Web Summit
Photo: Brad Shirakawa/Internet Archive
Pictures and videos are up here. You should definitely take the time to watch at least the talks on the second day, and the panel moderated by Kevin Marks, in particular this contribution from Zooko Wilcox. He provides an alternative view on my concerns about Economies of Scale in Peer-to-Peer Networks.
I am working on a post about my reactions to the first two days (I couldn't attend the third) but it requires a good deal of thought, so it'll take a while.
Monday, June 13, 2016
Eric Kaltman on Game Preservation
At How They Got Game, Eric Kaltman's Current Game Preservation is Not Enough is a detailed discussion of why game preservation has become extraordinarily difficult. Eric expands on points made briefly in my report on emulation. His TL;DR sums it up:
The current preservation practices we use for games and software need to be significantly reconsidered when taking into account the current conditions of modern computer games. Below I elaborate on the standard model of game preservation, and what I’m referring to as “network-contingent” experiences. These network-contingent games are now the predominant form of the medium and add significant complexity to the task of preserving the “playable” historical record. Unless there is a general awareness of this problem with the future of history, we might lose a lot more than anyone is expecting. Furthermore, we are already in the midst of this issue, and I think we need to stop pushing off a larger discussion of it.
Well worth reading.
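To see why "network-contingent" games resist the standard preservation model, consider this toy sketch; the server name and handshake are invented for illustration, not taken from Kaltman's post. Even a perfectly preserved client binary is inert once the vendor's service disappears:

```python
# Toy illustration (hypothetical) of a network-contingent launch check:
# the client will not start without a live connection to a vendor
# service, so archiving the client alone preserves bits, not play.

import socket

AUTH_HOST = "auth.example-game.com"  # invented vendor login server
AUTH_PORT = 443


def can_start_game(timeout=3.0):
    """Refuse to launch unless the vendor's auth server answers."""
    try:
        with socket.create_connection((AUTH_HOST, AUTH_PORT),
                                      timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    if can_start_game():
        print("Login server reachable; launching.")
    else:
        print("Login server gone; the preserved client cannot run.")
```

Preserving a playable record here means capturing or re-implementing the server side as well, which is exactly the added complexity Kaltman describes.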
Tuesday, June 7, 2016
The Need For Black Hats
I was asked to provide some background for a panel on "Security" at the Decentralized Web Summit held at the Internet Archive. Below the fold is a somewhat expanded version.
Friday, June 3, 2016
He Who Pays The Piper
As expected, the major publishers have provided an amazingly self-serving response to the EU's proposed open access mandate. My suggestion for how the EU should respond in turn is:
When the EU pays for research, the EU controls the terms under which it is to be published. If the publishers want to control the terms under which some research is published, the publishers should pay for that research. You can afford to. ;-)