Earlier today, I presented at the national open research conference Hvordan endres forskningshverdagen når åpen forskning blir den nye normalen? ("How does everyday research change when open research becomes the new normal?"). The conference is organized by the Norwegian Forum for Open Research and is coordinated by Universities Norway. It has been great to follow the various discussions at the conference. One observation is that very few question the transition to Open Research. We have, finally, come to a point where openness is the new normal. Instead, the discussions have focused on how we can move forward. Having many active researchers in the panels also led to a focus on solutions instead of policy.
Opening the process makes researchers document everything more carefully. For example, nobody wants to make messy data or code available. Adding metadata and descriptions also helps improve the quality of what is made available, and helps weed out irrelevant content.
Making the different parts openly available is important for ensuring transparency in the research process. This allows reviewers (and others) to check claims in published papers. It also allows others to replicate results or to use the data and methods in other research.
This openness and accessibility will ultimately lead to better quality control. Some people complain that we make lots of irrelevant information available. True, not everything that is made available will be checked or used. The same is the case for most other things on the web. That does not mean that nobody will ever be interested. We also need to remember that research is a slow activity; it may take years for research results to be used.
Of course, we face many challenges when trying to work openly. As I have described previously, we particularly struggle with privacy and copyright issues. We also don’t have the technical solutions we need. That led me to my main point in the talk.
Connecting the blocks
The main argument in my presentation was that we need to think about connecting the various blocks in the Open Research puzzle. There has, over the last few years, been a lot of focus on individual blocks. First, making publications openly available (Open Access). Nowadays, there is a lot of discussion about Open Data and how to make data FAIR (Findable, Accessible, Interoperable, Reusable). There is also some development in the other building blocks. What is lacking today is a focus on how the different blocks are connected.
By developing individual blocks without thinking sufficiently about their interconnectedness, I fear that we lose out on some of the main points of opening everything. Moving towards Open Research is not only about making things open; it is about rethinking the way we research. That is the idea of the concept of Science 2.0 (or Research 2.0, as I would prefer to call it).
There is much to do before we can properly connect the blocks. But some elements are essential:
Persistent identifiers (PIDs): Having unique and permanent digital references makes it possible to find and reuse digital material. Examples include DOIs for datasets, ORCID iDs for researchers, and so on.
Timestamping: Many researchers are concerned about who did something first. For example, many people delay releasing their data because they want to publish an article first. That is because the data (currently) does not have any “value” in itself. In my thinking, if data had PIDs and timestamps, they would also be citable. This should be combined with proper recognition of such contributions.
Version control: It has been common to archive research results once the research is done. This is based on pre-digital workflows. Today, it is much better to provide solutions for proper version control of everything we do.
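To make the three elements above a little more concrete, here is a minimal sketch of how they could come together in a citable record for one version of a dataset. This is purely illustrative and not any particular repository's API: the DOI is a made-up example, and the content hash stands in for proper version control by pinning down exactly which version is being cited.

```python
import hashlib
import re
from datetime import datetime, timezone

# Common "10.prefix/suffix" DOI shape; real-world DOI syntax is looser.
DOI_PATTERN = re.compile(r"^10\.\d{4,9}/\S+$")

def make_record(doi: str, data: bytes) -> dict:
    """Build a minimal citable record for one version of a dataset."""
    if not DOI_PATTERN.match(doi):
        raise ValueError(f"Not a DOI-shaped identifier: {doi!r}")
    return {
        # Persistent identifier, in its canonical resolver form
        "pid_url": f"https://doi.org/{doi}",
        # Timestamp: when this version was registered (UTC)
        "stamped_at": datetime.now(timezone.utc).isoformat(),
        # Version fingerprint: a hash of the exact contents cited
        "sha256": hashlib.sha256(data).hexdigest(),
    }

record = make_record("10.5281/zenodo.1234567", b"example dataset contents")
print(record["pid_url"])
```

A real service would add an independent witness to the timestamp (for example, through a repository deposit), but the sketch shows why PIDs, timestamps, and version fingerprints belong together: all three are needed before a citation can point unambiguously at one version of one dataset.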
Fortunately, things are moving in the right direction. It is great to see more researchers trying to work openly. That also exposes the current “holes” in infrastructures and policies.
It is challenging to find good illustrations to use in presentations and papers. For that reason, I hope to help others by sharing some of the illustrations I have made myself. I share them with a permissive license (CC-BY) so they can easily be reused for various purposes.
I start with the “puzzle” that I often use in presentations about Open Research. It outlines some of the various parts of the research process and how they can be made (more) open. I often think about the blocks as placed on a timeline from left to right.
Feel free to grab the images above. If you are interested in better versions, I have posted PDFs of both versions and the source presentation files (in PPTX and ODP) on GitHub. So head over there to download either the Norwegian or English files.
What is the future of academic career assessment? How can open research practices be included as part of a research evaluation? These were some of the questions we asked ourselves in a working group set up by Universities Norway. Almost two years later, the report is ready. Here I will share some of the ideas behind the suggested Norwegian Career Assessment Matrix (NOR-CAM) and some of the other recommendations coming out of the workgroup.
EUA work on research assessment
I have for some years been Norway’s representative in the European University Association’s Expert Group on Open Science/Science 2.0 (on a side note, I have written elsewhere about why I think it should be called Open Research instead). The expert group meets 3-4 times a year, usually in Brussels but nowadays online, to discuss how Open Science principles can be developed and implemented in European universities.
A lot of things have happened in the world of Open Science during the three years that I have been in the expert group. Open access to publications is improving every day. Open access to research data is coming along nicely, although there are still many challenges. Despite the positive developments, there is one key challenge that we always get back to discussing: research assessment. How should researchers get their “points” in the system, who should get the job, and who should get a promotion?
Up until now, publication lists and citation counts have been the most important “currency” for researchers. We have, over the years, seen an unfortunate focus on metrics, like the h-index and the journal impact factor (and others). The challenge is that only asking for publication lists (and publication-related metrics) takes focus away from all the other elements of an open research ecosystem.
The need to rethink research assessment led to the EUA Webinar on Academic Career Assessment in the Transition to Open Science last year. As the title of the webinar shows, we decided to broaden the perspective from only thinking about research assessment to considering academic career assessment more generally. This also became the focus of the Universities Norway workgroup and the final report.
In the report we list six principles for the future of career assessment:
Measure quality and excellence through a better balance between quantitative and qualitative goals
Recognise several competencies as merits but not in all areas at the same time or by each employee
Assess all results, activities and competencies in the light of Open Science principles
Practice transparency in the assessment and visibility of what should be recognised as merit
Promote gender balance and diversity
Assist in the concrete practice of job vacancy announcements and assessment processes locally
The working group then went on to suggest four recommendations for different actors (individuals, institutions, research funders, and government):
To establish a comprehensive framework for the assessment of academic careers that:
balances quantitative and qualitative goals and forms of documentation for academic standards and competencies
enables diverse career paths and promotes high standards in the three key areas: education, research and interaction with society
recognises the independent and individual competencies of academic staff as well as their achievements in groups and through collaboration
values Open Science principles (including open assessment systems)
values and encourages academic leadership and management
To engage internationally in developing a Norwegian assessment model because:
changes in the assessment criteria cannot be made by one country alone
a Norwegian model can contribute to related processes internationally
To use NOR-CAM as a practical and flexible tool for assessing academic results, competence and experience for academic personnel. NOR-CAM will highlight six areas of expertise through systematic documentation and reflection
To develop an ‘automagic CV system’ that enables academics to retrieve data that can be used to document competencies and results in their own career, including applications for positions, promotions and external funding.
Today, I presented the Norwegian report to the EUA expert group. In many ways, the circle is complete. After all, the inspiration for the Norwegian report came directly from the work of EUA. Hopefully, the report can inspire others in Europe (and beyond) to think anew about career assessment.
Even though it took nearly two years, writing a report is only the beginning. Now it is time to work on how NOR-CAM can be implemented. I am looking forward to contributing to making it become a reality.
After a year of primarily online activities, we are slowly preparing for a new reality at the university. We will not go back to where we left off, but what will the new university be?
This post is inspired by a tweet by Rikke Toft Nørgård and a presentation she held on the development of the post-pandemic hybrid university. In the presentation, she points to a recent EUA report envisioning how universities should develop towards 2030. The aim is that universities should be:
Open, Transformative and Transnational
Sustainable, Diverse and Engaged
Strong, Autonomous and Accountable
These are good points. The challenge is to figure out how to make it happen. That is why I think it is good that EUA is bold enough to suggest three quite concrete action points:
Reform academic careers
Promote interdisciplinarity
Strengthen civic engagement
When I say “concrete” here, we need to consider that EUA is an organization with 800+ universities as members, so it is still fairly high-level advice. In the following, I will reflect briefly on each of these.
Reform academic careers
This is a topic that I have been engaged in for quite some time. As a member of the Young Academy of Norway, I was involved in developing several reports on the need for heterogeneous career paths in academia. People are different; fields are different, universities are different. Therefore, we also need to allow for various types of career paths. It is also important to help people more easily move in and out of academia.
As a member of a working group on career assessment at Universities Norway, we have been developing what we call the Norwegian Career Assessment Matrix (NOR-CAM). This has been inspired by the Open Science Career Assessment Matrix (OS-CAM). Our Norwegian model goes beyond only considering Open Science (which I would have preferred to be Open Research, but that is another story). Rather, we propose that researcher assessment should be based on many variables. More on that soon, since the report will be out before long.
Promote interdisciplinarity
This is another topic I have myself been interested in for a long time. In fact, my post trying to define interdisciplinarity is (by far) my most read article on this blog. I am lucky enough to co-direct RITMO Centre for Interdisciplinary Studies in Rhythm, Time, and Motion. Even the name suggests that we take interdisciplinarity seriously. At RITMO, musicologists, psychologists and informatics researchers work together in various ways. Not all of the research is interdisciplinary, however; some is multi-, cross-, or transdisciplinary. The main point is that we challenge traditional disciplinary boundaries.
A lot of people talk about the need to work interdisciplinarily. However, those of us who try to do it face several issues. It is challenging from an individual perspective; people are still largely assessed within their disciplines. That is why reforming the way we perform research assessment is important, as mentioned above.
There are also numerous institutional challenges. Most universities are disciplinarily organized into faculties and departments. There are good reasons for doing this, and I am not suggesting that we should get rid of faculties and departments altogether. However, universities need to be much more flexible in allowing people to research and teach across department and faculty borders.
I have for some time been promoting the development of matrix universities, in which the organization is both horizontal and vertical. Some European universities (for example, Cambridge and Oxford) and many American universities are organized both vertically (faculties and departments) and horizontally (colleges and schools). This is a more complex organization, but it promotes more meeting points. The problem with a matrix organization is that it may feel too rigid. A better metaphor may be “web” universities. This would allow for more complex interconnections across (and beyond) the organization.
Strengthen civic engagement
I find it particularly interesting that the EUA report so clearly focuses on creating universities “without walls”. Many of us who are on the inside of a university don’t really see these walls. After all, we have open doors in and out of the university. But it is important to acknowledge that there are walls that other people face.
Tearing down the walls may be difficult, however. After all, the good thing about walls is that they support the construction of the house and create a safe and sheltered space. But building a lot more doors in the house can be a good starting point.
To continue the analogy, I think that we should build universities with as many glass walls as possible. That means that people can easily move in and out of the university. It also means that it is possible to look into parts of the university that may otherwise be closed off. A move towards Open Research and Open Education is one approach to increasing the public visibility of what is going on inside universities. Citizen Science is another, in which researchers engage more actively with the general public.
There have been many unfortunate consequences of the corona crisis. Fortunately, some changes may also happen more quickly because more people realize that things need to change as we move on.
This week I led the workshop “Open Research Strategies and Tools in the NIME Community” at NIME 2019 in Porto Alegre, Brazil. We had a very good discussion, which I hope can lead to more developments in the community in the years to come. Below is the material that we wrote for the workshop.
Alexander Refsum Jensenius, University of Oslo
Andrew McPherson, Queen Mary University of London
Anna Xambó, NTNU Norwegian University of Science and Technology
Dan Overholt, Aalborg University Copenhagen
Guillaume Pellerin, IRCAM
Ivica Ico Bukvic, Virginia Tech
Rebecca Fiebrink, Goldsmiths, University of London
Rodrigo Schramm, Federal University of Rio Grande do Sul
The development of more openness in research has been in progress for a fairly long time, and has recently received a lot more political attention through the Plan S initiative, the Declaration on Research Assessment (DORA), the EU’s Horizon Europe, and so on. The NIME community has been positive towards openness since the beginning, but has still not fully explored this within the community. We call for a workshop to discuss how we can move forward in making the NIME community (even) more open throughout all its activities.
The aim of the workshop is to:
Agree on some goals as a community.
Showcase best practice examples as a motivation for others.
Promote existing solutions for NIME researchers’ needs.
Consider developing new solutions, where needed.
Agree on a set of recommendations for future conferences, to be piloted in 2020.
Welcome
Introduction of participants
Introduction to the topic
Alexander Refsum Jensenius
Open Publication perspectives
Alexander Refsum Jensenius
Dan Overholt
Rodrigo Schramm
Group-based discussion:
How can we improve the NIME publication template?
Should we think anew about the reviewing process (open review?)
Should we open for a “lean publishing” model?
How do we handle the international nature of NIME?
Group-based discussion:
What are some best practice Open NIME examples?
What tools/solutions/systems should be promoted at NIME?
Who should do the job?
End of workshop
The following sections present some more information about the topic, including the current state of affairs in the field.
What is Open Research?
There are numerous definitions of what Open Research constitutes. The FOSTER initiative has made a taxonomy with these overarching branches:
Open Access: online, free of cost access to peer reviewed scientific content with limited copyright and licensing restrictions.
Open Data: online, free of cost, accessible data that can be used, reused and distributed provided that the data source is attributed.
Open Reproducible Research: the act of practicing Open Science and offering users free access to experimental elements for research reproduction.
Open Science Evaluation: an open assessment of research results, not limited to peer-reviewers, but requiring the community’s contribution.
Open Science Policies: best practice guidelines for applying Open Science and achieving its fundamental goals.
Open Science Tools: refers to the tools that can assist in the process of delivering and building on Open Science.
Not all of these are equally relevant to the NIME community, and some aspects relevant to NIME are missing from the taxonomy.
Openness in the NIME Community
The only aspect that has been institutionalized in the NIME community is the conference proceedings repository. This has been publicly available from the start at nime.org, and in recent years CC-BY licensing has also been enforced for all publications.
Other approaches to openness are also encouraged, and NIME community members are using various types of open platforms and tools (see the appendix for details):
Source code repositories
Experiment data repositories
Music performance repositories
The question is how we can proceed in making the NIME community more open. This includes the conferences themselves, but also other activities in the community. A workshop on making hardware designs openly available was held in connection with NIME 2016, and the current project proposal may be seen as a natural extension of that discussion.
The Problem with the Term “Open Science”
Many of the initiatives driving the development of more openness in research refer to this as “Open Science”. In a European context this is particularly driven by some of the key players, including the European Union (EU), the European Research Council (ERC), and the European University Association (EUA). Consequently, a number of other, smaller institutions and individuals also use the term, often without thinking much about the wording.
The main problem with using Open Science as a general term is that it sounds like it is not something for researchers working in the arts and humanities. That was never the intention, of course, but rather a result of the movement developing from the sciences, and it is difficult to change a term once it has gained momentum.
NIME is—and is striving to continue to be—an inclusive community of researchers and practitioners coming from a variety of backgrounds. Many people at NIME would not consider that they work (only) in “science”, but would perhaps feel more comfortable under the umbrella “research”. This term can embrace “scientific research”, but also “artistic research” and R&D found outside of academic institutions. Thus, using the term “Open Research” fits the NIME community better than “Open Science”.
The question of freedom is also connected to that of openness. In the world of software development, one often talks about “free as in speech” (libre) versus “free as in beer” (gratis). This point also relates to issues of licensing, copyright, and reuse. Many people in the community are not affiliated with institutions and earn income from their work. Open research is closely connected to open source, open hardware, and open patents. This modern context for research and development of new musical technologies also extends beyond academia and must be well planned in order to attract industry partners. How can this be balanced with the need for openness?
Another term that is increasingly used in the community is the FAIR principles, which stands for Findable, Accessible, Interoperable, and Reusable. It is important to point out that FAIR is not the same as Open. Even though openness is an overarching aim, there is an understanding that privacy matters and copyright issues prevent general openness of everything. Still, the aim is to make data “as open as possible, as closed as necessary”. By applying the FAIR principles, metadata can be made available so that it is openly known what types of data exist and how to ask for access, even when the data themselves have to be closed.
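To illustrate the “as open as possible, as closed as necessary” idea, here is a sketch of a metadata record for a dataset that must stay closed for privacy reasons. The field names are only loosely inspired by common repository schemas (such as DataCite), and the DOI, title, and contact address are all hypothetical.

```python
# Hypothetical metadata record for a closed dataset. Everything below
# (DOI, names, contact address) is made up for illustration.
record = {
    "identifier": "10.5281/zenodo.0000000",
    "title": "Motion capture recordings of ensemble performance",
    "creators": [{"name": "Doe, Jane", "orcid": "0000-0002-1825-0097"}],
    "publicationYear": 2021,
    "resourceType": "Dataset",
    "rights": "Restricted access (contains personal data); metadata CC0",
    "access": {
        "status": "restricted",
        # Where to ask for access, even though the files are closed
        "contact": "data-steward@example.org",
    },
}

def is_findable(rec: dict) -> bool:
    """A record is findable (in the minimal sense) if it has a PID and a title."""
    return bool(rec.get("identifier")) and bool(rec.get("title"))

print(is_findable(record))
```

The point is that the record itself can be published openly: the data stay closed, but anyone can discover that the dataset exists, see what it contains, and know whom to contact for access.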
There are various “bucket-based” repositories that may be used, such as:
What is positive about such repositories is that you can store anything of (more or less) any size. The challenge, however, is the lack of specific metadata, specialized tools (such as visualization methods), and a community.
There are also specific solutions, such as GitHub for code sharing.
In 2018, a new repository called COMPEL was introduced, aimed at coupling the benefits of the aforementioned “bucket-based” approach with a robust metadata framework. It seeks to provide a convergence point for the diverse NIME-related communities and a means of linking their research output.
Openness in the Music Technology community
Compared with many other disciplines, the music technology community has embraced open perspectives for many years. A number of conferences make their archives publicly available, such as: