
Report from 2017 NSF Cybersecurity Summit for Large Facilities and Cyberinfrastructure, August 15-17, 2017

By Nancy Wilkins-Diehr

This was the 10th NSF Cybersecurity Summit for Large Facilities and Cyberinfrastructure since the series began in 2004, and my first time at the Summit. CTSC does an extraordinarily good job of organizing these events and including the community in agenda-setting. The room could not hold another registrant: with a record attendance of 120 and a waiting list, it was completely packed.

I had to miss portions of the meeting for other calls and meetings, but was pleased to catch the opening keynotes from Irene Qualters, director of the Office of Advanced Cyberinfrastructure at the NSF, and Jeff Spies from the Center for Open Science. Slides for the event are posted here.

Irene Qualters spoke about a number of important topics: research funding trends (and the broad mix of disciplines covered by NSF that makes it unique), national priorities that NSF supports, NSF's expansive view of what constitutes CI, and the development of new CI priorities based on wide community input. Gateways were, of course, mentioned as an important component of CI.

Jeff Spies gave a very thoughtful talk entitled "A Workflow-Centric Approach to Increasing Reproducibility and Data Integrity." Jeff is also a developer of the Open Science Framework (OSF). He gave a terrific description of publication bias, where our conclusions are driven by what we want to see, using several optical illusions to show that our brains don't always give us accurate information.

Jeff then walked through the research process. Typically, researchers design, collect and analyze, report, and then publish, with peer review taking place just before publication. What if that review took place in the design phase instead, so that publication bias doesn't enter in and researchers are judged on the impact of their questions rather than their results?

Jeff has also seen success in the use of badges awarded by journals to encourage open data: data sharing increased tenfold in journals that adopted them. There is power in nudging incentives. But we can't always change incentive structures, so as technology developers, what can we do? How can we alter the research workflow?

Appending openness is "just another thing" researchers have to do, and the same is true of reproducibility, reporting, and preservation. OSF tries to integrate these "good to do" practices into researchers' existing workflows, so they don't have to learn something new. Open workflows are the key. I found Jeff's examples very compelling and very relevant for science gateway developers, especially those looking to support the complete research pipeline.

Marjory Blumenthal of RAND spoke on "Cybersecurity Lessons from Studying Big Data and Privacy," sharing her experiences on PCAST, the President's Council of Advisors on Science and Technology. PCAST's recommendations on cybersecurity included:

  • Focus more on actual use, less on collection

  • Design policy around outcomes, not specific technologies

  • Strengthen U.S. research in privacy-related technologies and relevant social science

  • Promote increased education and training for privacy protection

  • U.S. government should lead by example, sustain a leading position

There was a wide-ranging discussion on privacy, particularly as it relates to the Internet of Things. Finally, there was a very dynamic panel featuring representatives from three cloud providers: Susie Adams (Azure/Microsoft), Mark Ryland (AWS), and Matthew O'Connor (Google). For cybersecurity professionals, this was a very well-run, community-designed event.