hmmm… I might have a solution… I'm using drum tracks 11-8 (the ones at the top of the music editor) for defining 4 rows of falling notes, and I have plans to use a normal instrument for defining long dots, but the rest of the drums (7-0, and 23-12 in the bass clef) could call… well… whatever I want them to call…
If anyone wants extra visual effects, or maybe physical ones (changing how notes work somehow), please respond with ideas! I have an idea where, mid-song, all the note rows could collapse into one in the center, and all the notes could be hit using any key. Color-changing effects too! I could literally make a custom function that you can "call" mid-song to do whatever you want, even stuff like fading the volume in and out, changing the color palette, and so on!
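Just to sketch what I mean by "call whatever you want" (this isn't real code from the project, and the effect names and the triggerEffect helper are all made up), it could be as simple as a lookup table of effect functions that a note in the dots track picks from:

```typescript
// Hypothetical sketch: a table of mid-song effects keyed by number.
// Nothing here is from the actual project; the names are illustrative only.
const effects: { [id: number]: () => void } = {
    0: () => console.log("fade the volume out"),
    1: () => console.log("swap the color palette"),
    2: () => console.log("collapse every row into one in the center"),
};

function triggerEffect(id: number) {
    const effect = effects[id];
    if (effect) effect();
}

// e.g. a specific drum hit in the dots track could end up doing triggerEffect(2)
```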
This game (and that effect idea) is heavily inspired by Milthm, a rhythm game that's free on Steam! It's really cool-looking, and it's more for people playing on an iPad than for keyboard players (me), but I still really enjoy the lower difficulty levels! Anyways, that was a bit of a side tangent. In other news, how about I make this even longer by adding a devlog? Whoo!
Devlog! Whoo!
OK, I found something very cool that makes my job super easy, at least when it comes to the drum tracks. I have no idea if this is the case for the normal notes… idk, it probably is, now that I think about it.
I have been coding a way to play both a "music" track and a "dots" track at the exact same time, using the music track's timekeeping system to also call the code that "plays" the dots track, so no matter the lag they can't possibly drift too far apart! There is still that 1000 ms delay between the song track "playing" a sound and the sound actually happening. If that somehow took longer or shorter, or if the dots (which aren't sprites, because sprites suck) did something weird, it could end up with a tiny bit of desync, even though the dots move each frame based on the milliseconds passed since they spawned, so them taking longer or shorter to fall shouldn't be possible. I think a tiny bit of desync is alright, because idk how I would fix it anyway. Playing on hardware might be interesting… I have hardware… hmm…
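For anyone curious, here's roughly what I mean by "move each frame based on the milliseconds passed since they spawned". This is a simplified sketch, not the actual project code; Dot, FALL_MS, SPAWN_Y, and HIT_LINE_Y are just names I'm making up here:

```typescript
// Simplified sketch of time-based falling dots (all names made up).
const FALL_MS = 1000;     // matches the ~1000 ms delay before the sound plays
const SPAWN_Y = 0;        // where a dot appears
const HIT_LINE_Y = 100;   // where a dot should be when its sound happens

interface Dot {
    row: number;
    spawnTime: number;    // milliseconds when the dot was created
}

// Position depends only on elapsed time, not on how many frames have run,
// so a laggy frame can't change how long a dot takes to fall overall.
function dotY(dot: Dot, now: number): number {
    const t = Math.min((now - dot.spawnTime) / FALL_MS, 1);
    return SPAWN_Y + (HIT_LINE_Y - SPAWN_Y) * t;
}
```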
The way I'm "playing" two tracks is by editing everything so I can pass two song buffers into the fmusic.createSong function instead of just one. The "f" stands for "fake", because it's my fake music system, and because using a single letter made renaming the conflicting bits faster. Basically, everywhere a function handles something for the normal track, I copy the code and make it handle the dots track too. This worked well, and by the time I was writing the previous devlog I already had the song and dots tracks playing in sync. At that point, the second track simply created a dot every time a note was scheduled to play from the dots track and then just… didn't play the note. Melodic notes and drums are two sorta separate things in the music system, so reacting only when notes play and not when drums play, or vice versa, is a single property check (a single "if").
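In case that's hard to picture, here's the gist of what the copied dots-track handler ends up doing. Again, this is a made-up sketch, not the real fmusic code; ScheduledNote, dots, and spawnDot are names I invented for the example:

```typescript
// Made-up sketch of the dots-track handler; the real fmusic code differs.
interface ScheduledNote {
    isDrum: boolean;   // the single property check between drums and melodic notes
    drum: number;      // which drum line (only meaningful when isDrum is true)
}

const dots: { drum: number; spawnTime: number }[] = [];

function spawnDot(drum: number, now: number) {
    dots.push({ drum: drum, spawnTime: now });
}

// Called wherever the dots track would normally play a note.
function onDotsTrackNote(note: ScheduledNote, now: number) {
    if (note.isDrum) {
        spawnDot(note.drum, now);   // create a dot instead of playing the sound
    }
    // Melodic notes get skipped here; they'll become the long-dot track later.
}
```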
The thing that made the drum tracks easy is that they call a "renderDrumInstrument" function that takes in a single number representing which drum it is. Yep. Just a number. A single number that essentially tells the function which line the drum instrument is on. So yeah, I just copied the code that passes that number into the function, and now I can easily know which line a drum is playing on. (See the first paragraph of this post for the slightest idea of the possibilities I just unlocked!) If I haven't got multiple falling note lines working at the same time before this gets approved… idk, maybe something happened to me in the hour (at most) it should take to code this now that I have the drum number… or maybe I just got distracted and forgot to update the link, who knows.
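So the whole "multiple lines" feature basically boils down to something like this (again a made-up sketch, not the project's actual code; handleDotsDrum, spawnDotInRow, and triggerEffect are just example names):

```typescript
// Made-up sketch: turning the drum number from renderDrumInstrument into a row
// or an effect, following the plan from the first paragraph (drums 8-11 = rows).
function handleDotsDrum(drum: number, now: number) {
    if (drum >= 8 && drum <= 11) {
        const row = drum - 8;        // drums 8-11 map straight onto rows 0-3
        spawnDotInRow(row, now);
    } else {
        triggerEffect(drum);         // any other drum "calls" a custom effect
    }
}

function spawnDotInRow(row: number, now: number) {
    // placeholder: add a dot to that row's falling-note list
}

function triggerEffect(id: number) {
    // placeholder: look up and run a mid-song effect (like the earlier sketch)
}
```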
Side note: I got kinda confused at the start of this project because none of my functions were being called like they should be. Like, the "renderInstrument" function wasn't being called, but the music was still playing. That's when I realized that there are two music players. One is the normal one that runs when hardware is playing the music, and the other only runs when the music is played in the simulator, and it calls a bunch of C++ functions instead of the JavaScript ones! (C++ is just a guess, idk.)
@richard, does this second music player just for the simulator exist so that songs can play using a C++ version instead of the JavaScript one? When I deleted it, the music sounded the same playing from the hardware version as it did using the sim version. Is the JavaScript version just more laggy, or is there another reason for the separate one? Also, last time I had a music-related idea on stream, I got you all sidetracked for the rest of the stream. It was very funny, but would you prefer I ask code questions on stream or on the forum?