A framework for musical software development and analytics on mobile devices.


More people are making music with mobile devices every day, and more and more musicians and media artists are realizing the potential of mobile app development as an expressive format for their own work and the work of others.

MusicianKit stems from an interest in understanding not only what kinds of music people are making with their devices, but how they are making it and how their devices empower them to make it. It is a framework for writing musical applications in an expressive, musical manner, with a target audience of musicians and composers (including theorists, musicologists, and anyone else who works with music).

In addition, it acts as a system for distributed music analytics: it is intended to carry information about music made by musicians at all levels of training, so that theorists, musicologists, and other researchers, and ultimately the developers of new technologies, can meaningfully use that information to better understand their users' desires and to track trends in mobile music-making.
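
As a rough illustration of the kind of record such a carrier might transmit, consider the following sketch. The UsageEvent type and all of its fields are hypothetical illustrations of the idea, not part of the current API:

import Foundation

// Hypothetical sketch: a record a host app might emit about a musical action.
// None of these names are part of MusicianKit's current API.
struct UsageEvent: Codable {
    let timestamp: Date   // when the musical action occurred
    let module: String    // e.g. "PCSet", "Modes", "TransformationalTools"
    let action: String    // e.g. "getPrimeForm"
    let payload: [Int]    // e.g. the pitch-class content involved
}

let event = UsageEvent(timestamp: Date(), module: "PCSet", action: "getPrimeForm", payload: [7, 3, 1, 9])
let data = try? JSONEncoder().encode(event) // ready to be stored or sent on for analysis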

Platform & Architecture

MusicianKit is written in Swift 4 and is intended primarily for Apple's family of iOS devices. Because it does not rely on platform-specific UI frameworks, however, and given the open-source nature of Swift and Foundation, it could be ported fairly easily to macOS and other platforms, though this has not yet been done.

As is often the case with musical applications, iOS was chosen for its audio performance and its popularity among music enthusiasts and makers. Swift was chosen over Objective-C because this is intended to be an open-source, collaborative project: Swift's rising popularity, and its reputation as a less intimidating language to learn, are a good match for the goal of facilitating contributions from musicians.

Modules & Usage

Below are a few examples of the modules and areas of functionality (in various degrees of completeness) for which APIs are provided.


// Initialize and perform operations on pitch-class sets
let pcset0: PCSet = [0, 2, 3, 11] // PCSet literal
var pcset1 = PCSet([67, 63, 61, 69]) // Returns a PC Set with [7, 3, 1, 9]; declared var because it is mutated below
let pcset2 = PCSet("4-2") // Using a Forte code
let pcset3 = PCSet(4, 3, 1, 0, 2, 5) // Using a variadic initializer
print(pcset1.getPrimeForm()) // Prints [0, 2, 6, 8]
pcset1.transform(t: 3, i: false) // Mutating transformation to [10, 6, 4, 0]
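
For readers unfamiliar with the T(n)/T(n)I conventions these methods follow, here is a minimal sketch of the underlying arithmetic, written as a free function for illustration rather than as the PCSet method itself:

// Minimal sketch of the arithmetic behind transform(t:i:), for illustration only.
// T(n) maps pc -> (pc + n) mod 12; T(n)I maps pc -> (n - pc) mod 12.
func transformed(_ pcs: [Int], t: Int, i: Bool) -> [Int] {
    return pcs.map { pc in
        let mapped = i ? (t - pc) : (pc + t)
        return ((mapped % 12) + 12) % 12
    }
}

transformed([7, 3, 1, 9], t: 3, i: false) // [10, 6, 4, 0], matching the example above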

// Build a twelve-tone matrix from a ToneRow literal (returns an initialized ToneMatrix)
let matrix = ToneRow([1, 2, 7, 10, 4, 5, 9, 11, 8, 6, 3, 0]).buildMatrix()

// Alternatively, initialize a ToneMatrix with a ToneRow literal argument
let sameMatrix = ToneMatrix(row: [1, 2, 7, 10, 4, 5, 9, 11, 8, 6, 3, 0])
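
buildMatrix() encapsulates the standard twelve-tone matrix construction; sketched as a free function, that construction looks roughly like this (the actual implementation may differ):

// Sketch of the standard construction: each row of the matrix is a
// transposition of the original row, beginning on successive pitches
// of the row's inversion.
func makeMatrix(from row: [Int]) -> [[Int]] {
    func mod12(_ x: Int) -> Int { return ((x % 12) + 12) % 12 }
    let inversion = row.map { mod12(2 * row[0] - $0) }
    return inversion.map { start in
        let offset = mod12(start - row[0])
        return row.map { mod12($0 + offset) }
    }
}

makeMatrix(from: [1, 2, 7, 10, 4, 5, 9, 11, 8, 6, 3, 0])[0] // the prime row itself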


// Returns an array of semitonal offsets-from-tonic for a mode (e.g. Modes.Major.first.offsets -> [0, 2, 4, 5, 7, 9, 11])
let lydian = Modes.Major.fourth.offsets
let alsoLydian = Modes.Major.lydian.offsets
let phrygianDominant = Modes.Minor.Harmonic.fifth.offsets
let mixob6 = Modes.Minor.Melodic.fifth.offsets

// Create a custom scalar structure and retrieve it or replace it (or a mode of it) at any time
let myMode = Mode("flixodiddlian", [0, 2, 3, 5, 6, 7, 9, 11])
// In some other scope
let flixo = Modes["flixodiddlian"].offsets
let flixo2 = flixo.getMode(3)
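
The getMode call above amounts to a rotation and re-rooting of the parent collection. A minimal sketch of that operation, assuming plain [Int] offset arrays:

// Sketch of modal rotation: mode k of a scale is the same cycle of offsets
// re-rooted on the scale's k-th degree (1-indexed, matching getMode above).
func modeOffsets(of parent: [Int], degree k: Int) -> [Int] {
    let root = parent[k - 1]
    return parent.indices.map { i in
        ((parent[(i + k - 1) % parent.count] - root) % 12 + 12) % 12
    }
}

modeOffsets(of: [0, 2, 4, 5, 7, 9, 11], degree: 4) // [0, 2, 4, 6, 7, 9, 11], lydian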


// Perform a parallel transformation on a triad
let nextChord = TransformationalTools.transform((.C, .major), by: .P)

// Check for a single simple transformational mapping between two chords
let transformation = checkSingleTransformation(from: (.C, .major), to: (.A, .minor))
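
For reference, the P, L, and R operations named above act on (root, quality) pairs in a fixed way. Here is a minimal sketch of that arithmetic, using pitch-class integers for roots (0 = C) and illustrative names rather than the framework's own types:

// Sketch of neo-Riemannian P/L/R arithmetic on (root, quality) pairs.
enum Quality { case major, minor }
enum PLROperation { case P, L, R }

func apply(_ op: PLROperation, to chord: (root: Int, quality: Quality)) -> (root: Int, quality: Quality) {
    let isMajor = chord.quality == .major
    let quality: Quality = isMajor ? .minor : .major
    switch op {
    case .P: return (chord.root, quality)                            // C major <-> C minor
    case .L: return ((chord.root + (isMajor ? 4 : 8)) % 12, quality) // C major <-> E minor
    case .R: return ((chord.root + (isMajor ? 9 : 3)) % 12, quality) // C major <-> A minor
    }
}

apply(.R, to: (root: 0, quality: .major)) // (9, .minor): C major maps to A minor under R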


// Parse two common styles/systems of Roman numerals
let chord1 = RomanNumeral.Traditional("iv64")
let chord2 = RomanNumeral.Berklee("bVImaj7#11")

// Check chord parity (considers inversional equivalence)
if chord1 == chord2 { /* do something */ }

// Voice-lead smoothly between two harmonies (returns [Int] of MIDI note numbers)
let chord = [60, 63, 64, 67, 70]
// Note that Chord(_ chordSymbol: String) returns Chord? (Optional<Chord>)
if let target = Chord("Fmaj#11") {
	let voiced = Chord.voiceLead(from: chord, to: target)
}
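
One simple strategy for "smooth" voice leading is to move each voice to the nearest pitch belonging to the target harmony. The following sketch illustrates that greedy approach on raw MIDI numbers and pitch classes; it is a stand-in for, not a description of, what Chord.voiceLead(from:to:) actually does:

// Greedy sketch: send each voice to the nearest octave placement of any
// target pitch class (target assumed non-empty). A real implementation
// would also manage doublings, voice crossing, and so on.
func nearestVoiceLeading(from voices: [Int], toPitchClasses target: [Int]) -> [Int] {
    func nearest(_ pc: Int, to voice: Int) -> Int {
        let diff = ((pc - voice) % 12 + 18) % 12 - 6 // signed distance in [-6, 6)
        return voice + diff
    }
    return voices.map { voice in
        target.map { nearest($0, to: voice) }.min { abs($0 - voice) < abs($1 - voice) }!
    }
}

nearestVoiceLeading(from: [60, 63, 64, 67, 70], toPitchClasses: [5, 9, 0, 4, 11]) // toward F-A-C-E-B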


// Classify a sequence of MIDI-note-number-style doubles by the EDO they seem to imply (returns 72 below)
let EDO = XenharmonicTools.midiParseEDO([60, 42.33, 20, 44, 51.167, 90])

// Get an increment by which to generate sets of pitches in some arbitrary EDO. Returns a double (e.g. 0.5 in the example below)
let offset = XenharmonicTools.EDOGetIncrement(24)
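
Both calls come down to the same arithmetic: one step of n-EDO spans 12/n semitones in MIDI-note-number space, so classifying a sequence amounts to finding the smallest n whose pitch grid (approximately) contains every value. A rough sketch of that logic, with hypothetical names and a tolerance chosen purely for illustration:

// Sketch: the increment of n-EDO in MIDI-note-number space is 12/n semitones.
func edoIncrement(_ edo: Int) -> Double {
    return 12.0 / Double(edo) // e.g. 24 -> 0.5, 72 -> 0.1666...
}

// Sketch: smallest EDO whose grid contains every value, within a tolerance in semitones.
func parseEDO(_ notes: [Double], upTo maxEDO: Int = 96, tolerance: Double = 0.01) -> Int? {
    for edo in 12...maxEDO {
        let step = edoIncrement(edo)
        let offGrid = notes.contains { note in
            let steps = note / step
            return abs(steps - steps.rounded()) * step > tolerance
        }
        if !offGrid { return edo }
    }
    return nil
}

parseEDO([60, 42.33, 20, 44, 51.167, 90]) // 72, as in the example above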

Many more usage examples (and a Swift playground) are to follow.


As the process of making music is increasingly democratized and distributed through the proliferation of powerful, portable, multipurpose devices, a question arises: how can we effectively observe trends in these contemporary music-making formats, and use that information both to create systems that better serve and enrich the musical lives of their users and to introduce meaningful new musical and sonic experiences?

Additionally, as more and more musicians learn to write code to further their artistic work, tools are appearing that make that learning experience easier and more musical (Sonic Pi, TidalCycles, etc.), alongside others that act as mature investigative interfaces for musicologists and theorists (music21).

MusicianKit is one approach to both of these domains: it provides an API through which developers can integrate musical analytics and better understand how their systems are being used, and it models areas of typical musical study so that code about musical systems can be written expressively and, ultimately, musically.

Current Limitations & Future Plans

MusicianKit is currently a relatively small, modular framework addressing certain kinds of musical investigation. My hope is that other musicians and musical domain experts will contribute new modules related to their own work and interests, building on what is already there, adding new ways of working, or porting the framework to other platforms of interest.

Additionally, I hope that musicians and musical people working within non-Western musical cultures and traditions will bring their insight to creating interfaces that interact with their musical languages meaningfully and expressively as well.