Because good research needs good data

DMPTool and DMPonline workshop

Read about the DMPTool and DMPonline workshop at IDCC in San Francisco

Sarah Jones | 25 February 2014
The workshops at IDCC began yesterday, and we were out in force for the joint DMPTool and DMPonline event. We began the workshop with a quick round of introductions. A few attendees were already using DMPTool, but in the main people were interested in exploring the tools, specifically customising them to use as part of their RDM support services. 
Sarah Shreeves and I kicked off the presentations with a policy update. Requirements are pretty stable in both contexts at the moment, as new DMP templates were issued by the NIH and UK Research Councils 1-2 years ago. In the States the major development is waiting on clarification of what will be required in light of the Office of Science and Technology Policy announcements, and in Europe the most recent development is the open data pilot under Horizon 2020.
We then got down to the details, with a walkthrough of the changes to each tool. A lot of redevelopment has happened in both tools over the past year. For DMPTool this has been thanks to grants from the IMLS and the Alfred P. Sloan Foundation. New features in the DMPTool include institutional branding, the ability to share plans, a DMP review function to check / approve content, and allowing institutions to add their own questions and templates rather than just guidance. An admin interface has been developed to allow institutions to manage their own customisations, which will take the load off Perry & Sherry. And more community guidance resources and publicity materials are available to help promote the tool on campus. The admin interface will be trialled soon, and the complete new version will be out by late spring. It looks great, so follow the blog for news on when it launches.
For DMPonline, the changes come in light of an evaluation we ran in winter 2013. There were lots of features that users really liked which we've retained, such as the ability to share plans and add institutional templates and guidance. We've made a major change to the data model, though, to amend how the DCC Checklist for a DMP is used in the tool. This change allows funders and universities to ask their own questions, and makes plans much shorter and more relevant to users. We gave a demo to show the changes and outlined the future plans and new feature requests. You can try it out or watch a screencast that shows how it works.
The breakout discussions covered three topics:
  1. Outreach and advocacy
  2. Customising the tools
  3. Future development plans
In the advocacy breakout we discussed roles and responsibilities: who should provide support, and what form should this take to ensure it's scalable? We agreed that it should be a joint endeavour across IT, the library and research admin offices, with senior management support. There were also concerns about engaging academics and raising awareness of what they're committing to in DMPs.
Both tools can be customised in various ways, by adding guidance, suggested answers, new templates and institutional branding. There was a strong call for more example plans and a method to share customisations to help other organisations get started. Administrators would also like to see more stats to understand how the tools are being used and by whom.
The final breakout on future plans added more tasks to the developers' never-ending lists! The main suggestions were for more systems integration, better versioning with the ability to flag different statuses to differentiate published / complete versions from drafts and tests, and building the user community to encourage collaborative development.
The workshop gave a lot of food for thought. I love the look of the new DMPTool and see lots of potential to take ideas from it. It's heartening to see that similar features are evolving in both tools, such as the desire to support institutional customisations, share plans, and provide worked examples and suggested answers. It's always interesting to come together for these workshops and see how the tools compare. They're increasingly converging, so I think there's a lot of scope to continue the collaboration and learn from each other's lessons.