Composition and Studio Technology

= Software =


== DAWs ==
* Ableton Live
* Reaper
* Ardour


== Editors and Utilities ==
* Audacity
* Sox


== Instruments ==
=== [[Digital Instruments]] ===
=== [[Analog Instruments]] ===
=== [[Virtual Instruments]] ===


== Effects ==
* Delay
* Reverb
* Phaser
* Flanger


= Hardware =


== [[Modular synthesizer]] ==
* Eurorack
* AE modular


== Mixing Desks ==
* Allen & Heath QU-16
* PreSonus
* SSL


== Drum Machines ==
* DrumBrute
* TR-8
* TR-8S
* Volca Sample
* Volca Sample +


== Synths ==
* Roland JP-8000
* Roland TB-303
* Novation Bass Station Rack
* Korg Prophecy
* Korg Prologue
* Korg Minilogue


== MIDI Controllers ==
* Push
* Push 2
* BeatStep Pro
* Launch Control
* Launchkey 25


== Others ==
* [[MIO Midi Network Manager]]
* [[RTL-SDR]]


= Acoustic Space =


=== [[DIY]] ===
* Bass traps
* Eurorack cases
* Net Tape labels
* Electronics
* Fanzines


=== [[Visuals]] ===
* Graphics
* Video
* Collage
* Livestreaming
* Set Design
* Album covers
* Merch

= Composition =

== DAW Composition ==
* Rapid Composer
* Kords
* Magenta

== Generative Composition ==
* Koan
* [[Generative Music]]

=== Generative AI and DeepComposer ===
Explore the AWS DeepComposer service.

Train a model: get hands-on experience by training your own models and begin to grasp how they work. GANs have two contesting neural networks, one generative and one discriminative. The generator attempts to generate data that maps to a desired data distribution, while the discriminator learns to tell generated data apart from real examples.
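
A minimal sketch of that generator-versus-discriminator setup, assuming PyTorch. The piano-roll shape, layer sizes and loss choices are illustrative assumptions, not DeepComposer's actual architecture.

<syntaxhighlight lang="python">
# Minimal GAN sketch: a generator maps random noise to a small "piano roll"
# (time steps x pitches), a discriminator scores real vs. generated rolls.
import torch
import torch.nn as nn

TIME_STEPS, PITCHES, NOISE_DIM = 32, 128, 100

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NOISE_DIM, 512), nn.ReLU(),
            nn.Linear(512, TIME_STEPS * PITCHES), nn.Sigmoid(),  # note-on probabilities
        )

    def forward(self, z):
        return self.net(z).view(-1, TIME_STEPS, PITCHES)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(TIME_STEPS * PITCHES, 512), nn.LeakyReLU(0.2),
            nn.Linear(512, 1),  # one logit: real vs. generated
        )

    def forward(self, x):
        return self.net(x)

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

real_batch = torch.rand(16, TIME_STEPS, PITCHES)  # stand-in for real piano rolls

# One adversarial step: D learns to separate real from fake, G learns to fool D.
z = torch.randn(16, NOISE_DIM)
fake = G(z)

d_loss = bce(D(real_batch), torch.ones(16, 1)) + bce(D(fake.detach()), torch.zeros(16, 1))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

g_loss = bce(D(fake), torch.ones(16, 1))  # generator wants D to answer "real"
opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(f"d_loss={d_loss.item():.3f}  g_loss={g_loss.item():.3f}")
</syntaxhighlight>
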
 
Understand your model: examine how the generator and discriminator losses changed while training, how certain musical metrics changed, and visualise the generated music output for a fixed input at every iteration.
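
A sketch of that kind of training diagnostic, assuming NumPy and matplotlib. The generator and loss values here are stand-ins; the point is logging the two losses, a simple musical metric (note density) and the output for one fixed input at every iteration.

<syntaxhighlight lang="python">
# Training-diagnostics sketch: track losses, a musical metric, and the
# generated output for one fixed input across iterations.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
fixed_noise = rng.standard_normal(100)          # same input every iteration

def fake_generator(noise, step):
    """Stand-in for a trained generator: returns a (time x pitch) piano roll."""
    roll = rng.random((32, 128)) < 0.02 + 0.001 * step
    return roll.astype(float)

g_losses, d_losses, densities = [], [], []
for step in range(10):
    g_losses.append(1.0 / (step + 1))           # placeholder loss curves
    d_losses.append(0.7 + 0.01 * step)
    roll = fake_generator(fixed_noise, step)
    densities.append(roll.mean())               # fraction of active cells = note density
    plt.imsave(f"fixed_input_step_{step}.png", roll.T, cmap="gray_r")  # visualise output

plt.plot(g_losses, label="generator loss")
plt.plot(d_losses, label="discriminator loss")
plt.plot(densities, label="note density")
plt.xlabel("iteration"); plt.legend(); plt.savefig("training_curves.png")
</syntaxhighlight>
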
 
Music Studio gives you a chance to play music and use a GAN. First, record melodies or choose default melodies, then use a pre-trained or a custom model to generate original AI music compositions. Creating a composition:
# Record a melody: using the AWS DeepComposer keyboard or your computer keyboard, record a short melody as input.
# Generate a composition: when you are satisfied with your input melody, choose a model and then choose Generate composition.
# AWS DeepComposer takes your input melody and generates up to four accompaniment tracks.
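
A sketch of the melody-in, accompaniment-out flow, assuming the pretty_midi library for building the input MIDI file. generate_accompaniment() is a hypothetical placeholder for a trained model, not a DeepComposer API.

<syntaxhighlight lang="python">
# Build a short input melody as MIDI, then hand it to a (hypothetical) model.
import pretty_midi

melody = pretty_midi.PrettyMIDI()
lead = pretty_midi.Instrument(program=0)        # acoustic grand piano

# A short input melody: (MIDI pitch, start, end) in seconds
for pitch, start, end in [(60, 0.0, 0.5), (62, 0.5, 1.0), (64, 1.0, 1.5), (67, 1.5, 2.5)]:
    lead.notes.append(pretty_midi.Note(velocity=100, pitch=pitch, start=start, end=end))
melody.instruments.append(lead)
melody.write("input_melody.mid")

def generate_accompaniment(midi_path, n_tracks=4):
    """Hypothetical stand-in: load a trained generator and return accompaniment tracks."""
    raise NotImplementedError("plug in your trained model here")

# tracks = generate_accompaniment("input_melody.mid")
</syntaxhighlight>
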
Training a model:
# Choose an algorithm: choose a generative algorithm to train a model.
# Choose a dataset: choose a genre of music as your dataset.
# Tweak hyperparameters: choose how to train your model.
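
An illustrative training configuration that mirrors those three choices. The field names and default values are assumptions for the sketch, not DeepComposer's actual options.

<syntaxhighlight lang="python">
# Illustrative config: algorithm, dataset (genre), and hyperparameters.
from dataclasses import dataclass, field

@dataclass
class TrainingConfig:
    algorithm: str = "gan"            # which generative algorithm to train
    genre: str = "techno"             # dataset = a genre of music
    hyperparameters: dict = field(default_factory=lambda: {
        "epochs": 500,                # how long to train
        "batch_size": 64,
        "learning_rate": 2e-4,        # typical Adam rate for GANs
        "latent_dim": 100,            # size of the generator's noise input
    })

config = TrainingConfig()
print(config)
</syntaxhighlight>
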
== SSEYO Koan ==
Brian Eno used it. Where is it now?

https://intermorphic.com/sseyo/koan/

== Dynamic Composition ==
* FMOD
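
A conceptual sketch of one dynamic-composition technique (vertical layering) of the kind audio middleware such as FMOD implements in-engine: a single game-side intensity parameter fades pre-authored stems in and out. Plain Python for illustration, not FMOD's API.

<syntaxhighlight lang="python">
# Vertical layering: an intensity parameter (0..1) controls per-stem gains.

LAYERS = {          # stem name -> (intensity where it starts fading in, fully in)
    "pads":  (0.0, 0.2),
    "drums": (0.3, 0.5),
    "bass":  (0.4, 0.6),
    "lead":  (0.7, 0.9),
}

def layer_gains(intensity: float) -> dict:
    """Linear fade per stem as intensity rises."""
    gains = {}
    for name, (lo, hi) in LAYERS.items():
        if intensity <= lo:
            gains[name] = 0.0
        elif intensity >= hi:
            gains[name] = 1.0
        else:
            gains[name] = (intensity - lo) / (hi - lo)
    return gains

for intensity in (0.1, 0.5, 0.9):   # e.g. exploration, combat ramp-up, boss fight
    print(intensity, layer_gains(intensity))
</syntaxhighlight>
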
 
== AI [[Deep Learning]] Composition ==
* [[Automatic Music Generation]]
* Death Metal
* Techno

== [[Game Music]] ==
