Below you will find pages that utilize the taxonomy term “jupyter”
August 7, 2023
Making image parts transparent in Python
As part of my year-long #StillStanding project, I post an average image of the spherical video recordings on Mastodon daily. These videos have black padding outside the fisheye-like images, and this padding also appears in the average image.
It is possible to manually remove the black parts in some image editing software (of which open-source GIMP is my current favorite). However, as I recently started exploring ChatGPT for research, I decided to ask for help.
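A minimal sketch of one way to do this with Pillow (not necessarily the code from the post); the filename and threshold are assumptions:

```python
from PIL import Image

# Open the average image and add an alpha channel.
img = Image.open("average_image.png").convert("RGBA")  # filename is an assumption

threshold = 10  # pixels darker than this are treated as black padding (assumed value)
pixels = [
    (r, g, b, 0) if max(r, g, b) < threshold else (r, g, b, a)
    for (r, g, b, a) in img.getdata()
]

img.putdata(pixels)
img.save("average_image_transparent.png")
```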
June 12, 2023
Running a Jupyter Notebook in Conda Environment
I have been running Python-based Jupyter Notebooks for some time but never thought about using environments until quite recently. I had heard people talking about environments, but I didn't understand why I would need one.
Two days ago, I tried to upgrade to the latest version of the Musical Gestures Toolbox for Python and got stuck in a dependency nightmare. I tried to upgrade one of the packages that choked, but that only led to other packages breaking.
May 20, 2023
The effect of skipping frames for video visualization
I have been exploring different video visualizations as part of my annual #StillStanding project. Some of these I post as part of my daily Mastodon updates, while others I only test for future publications.
Most of the video visualizations and analyses are made with the Musical Gestures Toolbox for Python and structured as Jupyter Notebooks. I have been pondering whether skipping frames is a good idea. The 360-degree videos that I create visualizations from are shot at 25 fps.
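For reference, skipping frames just means reading every Nth frame; at 25 fps, keeping every fifth frame gives an effective 5 fps. A generic OpenCV sketch (not the toolbox's own code), with an assumed filename:

```python
import cv2

def sampled_frames(path, skip=5):
    """Yield every `skip`-th frame of a video (generic sketch, not MGT code)."""
    cap = cv2.VideoCapture(path)
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % skip == 0:
            yield frame
        index += 1
    cap.release()

# At 25 fps, skip=5 corresponds to sampling the video at 5 fps.
for frame in sampled_frames("standstill_360.mp4", skip=5):  # filename is an assumption
    pass  # feed the frame into whatever visualization is being computed
```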
April 10, 2023
100 Days and Still Standing
Today marks the 100th day of my annual #StillStanding project. In this blog post, I summarize some of my experiences so far.
Endurance
Some people questioned whether I would be able to stand still every single day for an entire year. But, hey, it is only ten minutes (out of 1,440) per day, and even though my life as a centre director is busy, it is always possible to find time for a standstill sometime during the day.
January 12, 2023
Running a workshop with a Jupyter Notebook presentation
Today, I ran a workshop called Video Visualization together with RITMO research assistant Joachim Poutaraud. The workshop was part of the Digital Scholarship Days 2023 organized by the University of Oslo Library, four days packed with hands-on tutorials on various useful topics.
Presentation slides made with Jupyter Notebook
Joachim has done a fantastic job updating the Wiki with all the new things he has implemented in the toolbox. However, the Wiki is not the best thing to use in a workshop; it contains too much information and would overload the participants.
January 3, 2023
Testing Mobile Phone Motion Sensors
For my annual Still Standing project, I am recording sensor data from my mobile phone while standing still for 10 minutes at a time. This is a highly curiosity-driven and data-based project, and part of the exploration is to figure out what I can get out of the sensors. I have started sharing graphs of the linear acceleration of my sessions with the tag #StillStanding on Mastodon. However, I wondered if this is the sensor data that best represents the motion.
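Plotting the linear acceleration itself is straightforward; a minimal sketch, assuming the logging app exports a CSV and that the column names below match it (they are assumptions):

```python
import pandas as pd
import matplotlib.pyplot as plt

# Column names are assumptions; adjust to whatever the sensor-logging app exports.
data = pd.read_csv("standstill_session.csv")
for axis in ["lin_acc_x", "lin_acc_y", "lin_acc_z"]:
    plt.plot(data["time"], data[axis], label=axis)

plt.xlabel("Time (s)")
plt.ylabel("Linear acceleration (m/s^2)")
plt.legend()
plt.savefig("standstill_linacc.png", dpi=150)
```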
December 30, 2022
Adding Title and Author to PDFs exported from Jupyter Notebook
I am doing some end of the year cleaning on my hard drive and just uploaded the Jupyter Notebook I used in the analysis of a mobile phone lying still earlier this year.
For some future studies, I thought it would be interesting to explore the PDF export functionality in Jupyter. That worked very well, except that I didn't get any title or author name at the top.
Then I found a solution on Stack Overflow.
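I won't reproduce the Stack Overflow answer here, but one commonly suggested approach, assuming a recent nbconvert that reads `title` and `authors` from the notebook metadata, is to set those fields before exporting:

```python
import nbformat

# Set title and author in the notebook metadata before exporting to PDF.
# Whether nbconvert picks up these fields depends on the version/template (an assumption).
nb = nbformat.read("phone_still_analysis.ipynb", as_version=4)  # filename is an assumption
nb.metadata["title"] = "Analyzing Recordings of a Mobile Phone Lying Still"
nb.metadata["authors"] = [{"name": "Alexander Refsum Jensenius"}]
nbformat.write(nb, "phone_still_analysis.ipynb")
```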
August 7, 2022
Analyzing Recordings of a Mobile Phone Lying Still
What is the background “noise” in the sensors of a mobile phone? In the fourMs Lab, we have a tradition of testing the noise levels of various devices. Over the last few years, we have been using mobile phones in multiple experiments, including the MusicLab app that has been used in public research concerts, such as MusicLab Copenhagen.
I have yet to conduct a systematic study of many mobile phones lying still, but today I tried recording my phone, a Samsung Galaxy S21 Ultra, lying still on the table for ten minutes.
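A first look at the background noise can be as simple as summarizing each accelerometer axis; a small sketch, again assuming a CSV export with made-up column names:

```python
import pandas as pd

# Column names are assumptions; adjust to the actual sensor export.
data = pd.read_csv("galaxy_s21_still.csv")
for axis in ["acc_x", "acc_y", "acc_z"]:
    print(f"{axis}: mean = {data[axis].mean():.5f}, std = {data[axis].std():.5f}")
```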
February 16, 2022
Completing the MICRO project
I wrote up the final report on the project MICRO - Human Bodily Micromotion in Music Perception and Interaction before Christmas. Now I finally got around to wrapping up the project pages. With the touch of a button, the project’s web page now says “completed”. But even though the project is formally over, its results will live on.
Aims and objectives
The MICRO project sought to investigate the close relationships between musical sound and human bodily micromotion.
November 13, 2021
Releasing the Musical Gestures Toolbox for Python
After several years in the making, we finally “released” the Musical Gestures Toolbox for Python at the NordicSMC Conference this week. The toolbox is a collection of modules targeted at researchers working with video recordings.
Below is a short video in which Bálint Laczkó and I briefly describe the toolbox:
https://youtu.be/tZVX_lDFrwc
About MGT for Python
The Musical Gestures Toolbox for Python includes video visualization techniques such as creating motion videos, motion history images, and motiongrams.
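As a generic illustration of the idea behind such motion images (not the toolbox's API), one can accumulate frame differences with OpenCV:

```python
import cv2
import numpy as np

# Generic frame-differencing sketch (not MGT for Python code); filename is an assumption.
cap = cv2.VideoCapture("recording.mp4")
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
motion_sum = np.zeros(prev.shape, dtype=np.float64)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    motion_sum += cv2.absdiff(gray, prev)  # accumulate per-frame motion
    prev = gray
cap.release()

# Scale the accumulated motion to 8 bits and save it as an image.
motion_image = cv2.normalize(motion_sum, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
cv2.imwrite("motion_image.png", motion_image)
```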
August 27, 2020
Why is open research better research?
I am presenting at the Norwegian Forskerutdanningskonferansen on Monday, which is a venue for people involved in research education. I have been challenged to talk about why open research is better research. In the spirit of openness, this blog post is an attempt to shape my argument. It can be read as an open notebook for what I am going to say.
Open Research vs Open Science
My first point in any talk about open research is to explain why I think "open research" is better than "open science".
November 29, 2019
Keynote: Experimenting with Open Research Experiments
Yesterday I gave a keynote lecture at the Munin Conference on Scholarly Publishing in Tromsø. This is an annual conference that gathers librarians, research administrators, and publishers, but also some researchers and students. It was my first time at the conference, and I found it to be a very diverse, interesting, and welcoming group of people.
Abstract
Is it possible to do experimental music research completely openly? And what can we gain by opening up the research process from beginning to end?
May 30, 2019
RaveForce: A Deep Reinforcement Learning Environment for Music Generation
My PhD student Qichao Lan is at SMC in Malaga this week, presenting the paper:
Lan, Qichao, Jim Tørresen, and Alexander Refsum Jensenius. “RaveForce: A Deep Reinforcement Learning Environment for Music Generation.” Proceedings of the Sound and Music Computing Conference. Malaga, 2019.
The framework that Qichao has developed runs nicely via a bridge between Jupyter Notebook and SuperCollider. This opens up lots of interesting experiments in the years to come.
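RaveForce's actual bridge is described in the paper; as a generic illustration of the idea, a notebook can talk to SuperCollider over OSC with python-osc (the address and arguments below are hypothetical):

```python
from pythonosc.udp_client import SimpleUDPClient

# Send an OSC message to sclang on its default language port (57120).
# The OSC address and arguments are made up, for illustration only.
client = SimpleUDPClient("127.0.0.1", 57120)
client.send_message("/synth/params", [440.0, 0.5])
```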
January 25, 2019
Testing reveal.js for teaching
I was at NTNU in Trondheim today, teaching a workshop on motion capture methodologies for the students in the Choreomundus master's programme. This is an Erasmus Mundus Joint Master Degree (EMJMD) investigating dance and other movement systems (ritual practices, martial arts, games, and physical theatre) as intangible cultural heritage. I am really impressed by this programme! It was a very nice and friendly group of students from all over the world, and they are experiencing a truly unique education run by the four partner universities.