More research should be solid instead of novel

Novelty is often highlighted as the most important criterion for getting research funding. That a manuscript is novel is also a major concern for many conference/journal reviewers. While novelty may be good in some contexts, I find it more important that research is solid.

I started thinking about novelty versus solidity when I read through the (excellent) blog posts about the ISMIR 2021 Reviewing Experience. These blog posts deal with many topics, but the question about novelty caught my attention. Even though the numbers are small, it turned out that the majority of the survey respondents listed novelty as the most important selection criterion for the conference. This is not unique to ISMIR; I think many journals and conferences ask about novelty.

Defining novelty

Given that novelty is a criterion “everyone” considers all the time, surprisingly few people discuss what it actually means. What does it mean for something to be novel? Merriam-Webster suggests that it is “something new or unusual.” But what should be new or unusual? The questions? The answers? The methods?

Research is about contributing new knowledge to humankind. After all, there is not really any point in reinventing the wheel. Still, most research is incremental. We all stand on the shoulders of giants. New research questions spring out of the “future work” sections of our colleagues’ articles. Our methods are refinements of disciplinary developments. Even so-called “groundbreaking” projects are incremental in nature if you scrutinize the details. Still, we hold on to the idea that “something unheard of before” is the ideal.

Research needs to be solid

My research is creative in both form and content. As such, many people think that my projects are novel in the sense of being new. My work is also both multi- and interdisciplinary, which means that I don’t really fit well anywhere. That could also be considered novel in the sense of being unusual. Still, what I am doing is not particularly new or unusual. From my perspective, I am working incrementally; everything I do builds on other people’s work. True, I combine theories and methods from different fields. This makes it look novel.

I can illustrate this with a research project I just finished: MICRO. Over the last few years, we have studied human music-related micromotion, the smallest actions it is possible to produce and perceive. This is new because no one has studied such motion in a musical context before. It is also unusual because the team comprised researchers from musicology, psychology, human movement science, and computer science.

The MICRO project can be considered novel. However, does that mean that everything we did in the project was novel? Some parts were, I guess. For example, we collected data by running the Norwegian Championship of Standstill annually. This was new and unusual the first time we did it. We even got quite a lot of media interest (it is not so often that music research is featured in the sports news on national TV).
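For readers curious about what a standstill measurement might boil down to, here is a minimal sketch of one way to quantify micromotion: the mean speed of a single motion-capture marker. The function name and the choice of measure are my own illustration, not the project’s actual analysis pipeline.

```python
import numpy as np

def micromotion_level(positions: np.ndarray, fs: float) -> float:
    """Mean speed of a single motion-capture marker, a simple
    standstill measure (lower = more still).

    positions: (N, 3) array of x, y, z samples; fs: sampling rate in Hz.
    Returns speed in the units of `positions` per second.
    """
    # Frame-to-frame displacement vectors
    displacements = np.diff(positions, axis=0)
    # Euclidean distance travelled per frame, scaled to per-second speed
    speeds = np.linalg.norm(displacements, axis=1) * fs
    return float(speeds.mean())

# A perfectly still marker yields zero micromotion
still = np.zeros((100, 3))
assert micromotion_level(still, fs=100.0) == 0.0
```

In a standstill competition, the participant with the lowest such value would win; a real pipeline would also need filtering and drift correction, which are omitted here.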

However, collecting data once does not make for outstanding science. Research is about asking questions, finding answers, and verifying those answers: repeating experiments, making slight modifications to the research design, improving the methods, and refining the analyses. That is what solid research is about.

I have researched human music-related micromotion for nearly ten years now. We have some answers, but there are many open questions. Many of these questions are neither new nor unusual any longer. But if we want to understand more about what is actually going on inside our bodies when we experience music, we need to continue researching what is no longer new and unusual. That is about doing solid research, not novel research.

Open Research is better research

I believe that open research is better research. Opening the research process makes researchers think more carefully about what they do and how they document it. This takes (some) more time than working closed. But it also makes it easier for others to understand what has been done. This is important from a peer review perspective. It also facilitates incremental research.

The MICRO project has been an open research flagship project. I began by sharing the funding application openly. Throughout the project, we have continuously described how we have worked. The data has been released in the Oslo Standstill Database, and source code has been shared on GitHub. All of this has taken time “away” from publishing journal articles. However, it is time for researchers to publish fewer articles and focus more on making data, code, and other materials available.

Opening the research process is part of solidifying the research. As researchers, we cannot hide behind a “black box” any longer. Everyone can scrutinize what we have done. In fact, I hope that more people will analyze our data and develop our code. That is part of the incremental nature of science.

Summing up

I am not against novel research. However, I think we have gotten to a point where there is too much focus on novelty. If you are applying for a large research grant, it may make sense to do something new. But it must be possible to submit a presentation to a conference or a manuscript to a journal based on plain, solid research. That may, in fact, be novel in itself! Hopefully, the transition to open research may actually help shift the focus toward solidity instead of novelty.

Open Research puzzle illustration

It is challenging to find good illustrations to use in presentations and papers. For that reason, I hope to help others by sharing some of the illustrations I have made myself. I will share them under a permissive license (CC-BY) so they can easily be reused for various purposes.

I start with the “puzzle” that I often use in presentations about Open Research. It outlines some of the various parts of the research process and how they can be made (more) open. I often think about the blocks as placed on a timeline from left to right.

An English-language version of my Open Research “puzzle.”

We will have a Norwegian conference on Open Research later this year, so I decided to make a version in Norwegian too for the conference page:

A Norwegian-language version of the Open Research “puzzle.”

Feel free to grab the images above. If you are interested in better versions, I have posted PDFs of both versions and the source presentation files (in PPTX and ODP) on GitHub. So head over there to download either the Norwegian or English files.

Why universities should care about employee web pages

Earlier this year, I wrote about my 23 tips to improve your web presence. Those tips were meant to encourage academics to care about how their university employee pages look. Such pages differ from university to university. Still, in most places, they contain an image and some standard information at the top, followed by more or less structured information further down. For reference, this is an explanation of how my employee page is built up:

My employee page at UiO contains both standard and customized elements.

Arguments for why universities should care

Academics need to be visible online. If you don’t publish and disseminate your research, it won’t have an impact. So it is in our own interest to have up-to-date personal pages with information about what we do. I would argue that it is also in the interest of universities that their employees’ personal pages are up-to-date and look good. My argument goes like this:

  1. The people are the most important asset of a university
  2. The web is the most important dissemination channel
  3. Hence, the employee pages should be the most important part of a university website

For some reason, this appears to be a radical statement. In my experience, many universities think of the employee pages as a “phonebook”: a static page with minimal information about how to get in touch with the employee. There is often a template for the page, including information about educational background. Sometimes there is also information about courses taught and research output. But rarely do they contain much more than that. There also seems to be little institutional interest in maintaining and improving such web pages.

The employee page is an important research infrastructure

Last year, I wrote in the UiO newspaper about considering the university’s web pages as a research infrastructure. In my experience, the web pages of a university are maintained by a communication department. They have one opinion on the purpose of the web pages: communication with students and the general public. I agree that these are the largest user groups of our web pages. But, in addition, we need to remember that researchers also communicate with other researchers.

Research communication is not the same as researcher communication. The former is dissemination activities targeting the broad public. The latter is based on ongoing intellectual exchange with research colleagues around the world. In my experience, communication departments care mostly about the first category. It is typically the role of university libraries to care about the other. Unfortunately, not many librarians are involved in the making and structuring of web pages. In my opinion, they should be.

Web pages as part of the transition to Open Research

I have previously written about why Open Research is better research and why I prefer Open Research over Open Science. In this context, I would mention that I believe university web pages, particularly employee web pages, are key to making a full transition to Open Research. Yes, we should focus on making publications Open Access and datasets FAIR. That should happen through proper repositories with unique IDs, and so on. However, the ambition of moving towards Science 2.0 goes beyond only opening the research results. We also need to open the various parts of the research process. Here I am thinking about the various building blocks of Open Research, as sketched in this figure:

The building blocks of an Open Research ecosystem.

The various parts in this ecosystem will live in different repositories and be scattered around the web. In my thinking, a person’s employee page is the place to gather all this information. It can serve as the hub of an academic’s activities.

Empowering the academics

It is in the interest of universities to provide their employees with the tools needed to store, share, and link to their research material. Many universities don’t seem to care too much about this. The result is that many employees don’t care either. Those who care will make their own solutions. Many set up private web pages with their own domains. Others use one of the social media sites for academics. These sites have understood how to make it fun to add information. It is a pity that universities don’t do the same.

At UiO, we are fortunate to have the possibility to edit the content of our personal pages. There are still things that could be improved in our system, and I regularly nag both the IT and communication departments about those issues. Still, I am fortunate to work at a university that empowers its academics with the possibility to update their own information. That is not the case at all universities. Some universities don’t allow employees to modify anything at all. I think that is a bad idea. It is bad for the employees, the university, and the transition to Open Research.

To all university leaders out there: how do you work with your university’s employee pages? To all academics: remember to update your personal page! And if you are not allowed to, ask your leaders to give you the tools and access to do so.

Launching NOR-CAM – A toolbox for recognition and rewards in academic careers

What is the future of academic career assessment? How can open research practices be included as part of a research evaluation? These were some of the questions we asked ourselves in a working group set up by Universities Norway. Almost two years later, the report is ready. Here I will share some of the ideas behind the suggested Norwegian Career Assessment Matrix (NOR-CAM) and some of the other recommendations coming out of the workgroup.

The Norwegian Career Assessment Matrix (NOR-CAM).

EUA work on research assessment

I have for some years been Norway’s representative in the European University Association’s Expert Group on Open Science/Science 2.0 (on a side note, I have written elsewhere about why I think it should be called Open Research instead). The expert group meets 3-4 times a year, usually in Brussels but nowadays online, to discuss how Open Science principles can be developed and implemented in European universities.

A lot of things have happened in the world of Open Science during the three years that I have been in the expert group. Open access to publications is improving every day. Open access to research data is coming along nicely, although there are still many challenges. Despite the positive developments, there is one key challenge that we always get back to discussing: research assessment. How should researchers get their “points” in the system, who should get the job, and who should get a promotion?

Up until now, publication lists and citation counts have been the most important “currency” for researchers. We have, over the years, seen an unfortunate focus on metrics, like the h-index and the journal impact factor (and others). The challenge is that only asking for publication lists (and publication-related metrics) takes focus away from all the other elements of an open research ecosystem.

Various building blocks in an open research ecosystem.

The need to rethink research assessment led to the EUA Webinar on Academic Career Assessment in the Transition to Open Science last year. As the title of the webinar shows, we decided to broaden the perspective from only thinking about research assessment to considering academic career assessment more generally. This also became the focus of the Universities Norway workgroup and the final report.

Six principles

In the report we list six principles for the future of career assessment:

  1. Measure quality and excellence through a better balance between quantitative and qualitative goals
  2. Recognise several competencies as merits but not in all areas at the same time or by each employee
  3. Assess all results, activities and competencies in the light of Open Science principles
  4. Practice transparency in the assessment and visibility of what should be recognised as merit
  5. Promote gender balance and diversity
  6. Assist in the concrete practice of job vacancy announcements and assessment processes locally

Four recommendations

The work group then went on to suggest four recommendations for different actors (individuals, institutions, research funders, government):

  1. To establish a comprehensive framework for the assessment of academic careers that:
    • balances quantitative and qualitative goals and forms of documentation for academic standards and competencies
    • enables diverse career paths and promotes high standards in the three key areas: education, research and interaction with society
    • recognises the independent and individual competencies of academic staff as well as their achievements in groups and through collaboration
    • values Open Science principles (including open assessment systems)
    • values and encourages academic leadership and management
  2. To engage internationally in developing a Norwegian assessment model because:
    • changes in the assessment criteria cannot be made by one country alone
    • a Norwegian model can contribute to related processes internationally
  3. To use NOR-CAM as a practical and flexible tool for assessing academic results, competence and experience for academic personnel. NOR-CAM will highlight six areas of expertise through systematic documentation and reflection.
  4. To develop an ‘automagic CV system’ that enables academics to retrieve data that can be used to document competencies and results in their own career, including applications for positions, promotions and external funding.

Follow-up

Today, I presented the Norwegian report for the EUA workgroup. In many ways, the circle is complete. After all, the inspiration for the Norwegian report came directly from the work of EUA. Hopefully, the report can inspire others in Europe (and beyond) to think anew about career assessment.

Even though it took nearly two years, writing a report is only the beginning. Now it is time to work on how NOR-CAM can be implemented. I look forward to contributing to making it a reality.

Read the full report here:

MusicTestLab as a Testbed of Open Research

Many people talk about “opening” the research process these days. Due to initiatives like Plan S, much has happened when it comes to Open Access to research publications. Things are also happening when it comes to sharing data openly (or at least FAIR). Unfortunately, there is currently more talk about Open Research than action. At RITMO, we are actively exploring different strategies for opening our research. The most extreme case is that of MusicLab. In this blog post, I will reflect on yesterday’s MusicTestLab – Slow TV.

About MusicLab

MusicLab is an innovation project by RITMO and the University Library. The aim is to explore new methods for conducting research, research communication, and education. The project is organized around events: concerts in public venues that are also the objects of study. The events also contain an edutainment element through panel discussions with world-leading researchers and artists, as well as “data jockeying” in the form of live analysis of the recorded data.

We have carried out five full MusicLab events so far, plus a couple of in-between cases. Now we are preparing for a huge event in Copenhagen with the Danish String Quartet. The concert has already been postponed once due to corona, but we hope to make it happen in May next year.

The wildest data collection ever

As part of the preparation for MusicLab Copenhagen, we decided to run a MusicTestLab to see if it is at all possible to carry out the type of data collection that we would like to do. Usually, we work in the fourMs Lab, a custom-built facility with state-of-the-art equipment. This is great for many things, but the goal of MusicLab is to do data collection in the “wild”, which would typically mean a concert venue.

For MusicTestLab, we decided to run the event on the stage in the foyer of the Science Library at UiO, which is a real-world venue that gives us plenty of challenges to work with. We decided to bring a full “package” of equipment, including:

  • infrared motion capture (Optitrack)
  • eye trackers (Pupil Labs)
  • physiological sensors (EMG from Delsys)
  • audio (binaural and ambisonics)
  • video (180° GoPros and 360° Garmin)

We are used to working with all of these systems separately in the lab, but combining them in an out-of-lab setting is more challenging, especially under the pressure of setting everything up in a fairly short amount of time.
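One practical problem of combining such systems is that the streams run on different clocks and sampling rates and must be brought onto a common time base before joint analysis. The sketch below is my own simplification with made-up sampling rates, not the actual MusicLab pipeline; it uses linear interpolation onto a shared reference clock.

```python
import numpy as np

def resample_to_common_clock(t: np.ndarray, values: np.ndarray,
                             t_common: np.ndarray) -> np.ndarray:
    """Linearly interpolate one sensor stream onto a shared time base.

    t: (N,) sample timestamps in seconds; values: (N,) samples;
    t_common: (M,) target timestamps (e.g., the motion-capture clock).
    """
    return np.interp(t_common, t, values)

# Hypothetical example: a 1000 Hz physiology stream resampled
# onto a 100 Hz reference clock.
t_ref = np.arange(0.0, 1.0, 0.01)        # 100 Hz reference clock
t_emg = np.arange(0.0, 1.0, 0.001)       # 1000 Hz sensor clock
emg = np.sin(2 * np.pi * 5 * t_emg)      # stand-in signal
emg_aligned = resample_to_common_clock(t_emg, emg, t_ref)
assert emg_aligned.shape == t_ref.shape
```

A real setup would also need to estimate the offset and drift between device clocks (e.g., from a shared sync event such as a clap) before resampling; that step is omitted here.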

Musicians on stage with many different types of sensors on, with RITMO researchers running the data collection and a team from LINK filming.

Streaming live – Slow TV

In addition to actually doing the data collection in a public venue, where people passing by can see what is going on, we decided to also stream the entire setup online. This may seem strange, but we have found that many people are actually interested in what we are doing. Many people also ask about how we do things, and this was a good opportunity to show people the behind-the-scenes of a very complex data collection process. The recording of the stream is available online:

To make it a little more viewer-friendly, the stream features live commentary by myself and Solveig Sørbø from the library. We talk about what is going on and interview the researchers and musicians. As can be seen from the stream, it was quite a hectic event, further complicated by corona restrictions. We were about an hour late for the first performance, but we managed to complete the whole recording session within the allocated time frame.

The performances

The point of the MusicLab events is to study live music, and this was also the focal point of the MusicTestLab, which featured the young, student-led Borealis String Quartet. They performed two movements of Haydn’s Op. 76, no. 4 «Sunrise» quartet. The first performance can be seen here (with a close-up of the motion capture markers):

The first performance of Haydn’s string quartet Op. 76, no. 4 (movements I and II) by the Borealis String Quartet.

After the first performance, the musicians took off the sensors and glasses, had a short break, and then put everything back on again. The point of this was for the researchers to get more experience with putting everything on properly. From a data collection point of view, it is also interesting to see how reliable the data are across recordings. The second performance can be seen here, now with a projection of the gaze from the violist’s eye-tracking glasses:

The second performance of Haydn’s string quartet Op. 76, no. 4 (movements I and II) by the Borealis String Quartet.
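For the curious, a between-recording comparison could be as simple as the sketch below: my own illustration, not the project’s analysis code. It treats the Pearson correlation between two recordings of the same movement feature as a crude test-retest indicator.

```python
import numpy as np

def reliability(rec_a: np.ndarray, rec_b: np.ndarray) -> float:
    """Pearson correlation between two recordings of the same feature,
    truncated to their common length: a crude test-retest indicator."""
    n = min(len(rec_a), len(rec_b))
    return float(np.corrcoef(rec_a[:n], rec_b[:n])[0, 1])

# Two identical recordings correlate perfectly
x = np.sin(np.linspace(0.0, 10.0, 500))
assert abs(reliability(x, x) - 1.0) < 1e-9
```

In practice, the two performances would first need to be time-aligned (tempo varies between takes), and a proper reliability analysis would use a measure such as the intraclass correlation coefficient; this sketch only conveys the idea.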

A successful learning experience

The most important conclusion of the day was that it is, indeed, possible to carry out such a large and complex data collection in an out-of-lab setting. It took an hour longer than expected to set everything up, but it also took an hour less to take everything down. This is valuable information for later. We also learned a lot about what types of clamps, brackets, cables, and so on are needed for such events. Also useful was the experience of calibrating all the equipment in a new and uncontrolled environment. All in all, the experience will help us make better data collections in the future.

Sharing with the world

Why is it interesting to share all of this with the world? RITMO is a Norwegian Centre of Excellence, which means that we receive a substantial amount of funding for doing cutting-edge research. We are also in a unique position in having a very interdisciplinary team of researchers with broad methodological expertise. With the trust we have received from UiO and our many funding agencies, we therefore feel an obligation to share as much as possible of our knowledge and expertise with the world. Of course, we present our findings at the major conferences and publish our final results in leading journals. But we also believe that sharing the way we work can help others.

Sharing our internal research process with the world is also a way of improving our own way of working. Having to explain what you do to others helps sharpen your own thinking. I believe that this, in turn, leads to better research. We cannot run MusicTestLabs every day. Today, all the researchers will copy the files we recorded yesterday and start on the laborious post-processing of all the material. Then we can start on the analysis, which may eventually lead to a publication a year (or two or three) from now. If we do end up with a publication (or more) based on this material, everyone will be able to see how the data were collected and follow the processing through all its stages. That is our approach to doing research that is verifiable by our peers. And if it turns out that we messed something up and the data cannot be used for anything, we have still learned a lot through the process. In fact, we even have a recording of the whole data collection process, so we can go back and see what happened.

Other researchers need to come up with their own approaches to opening their research. MusicLab is our testbed. As can be seen from the video, it is hectic. Most importantly, though, it is fun!

RITMO researchers transporting equipment to MusicTestLab in the beautiful October weather.