Provide the same approach to MIDI as on the Play, where you can switch between MIDI and Audio views to manage each.
What is the problem?
The Tracker only allows 8 channels of audio in total, whereas the Play can do 16 (8 audio, 8 MIDI). Would sending 8 channels of MIDI data be technically feasible? It shouldn’t greatly increase CPU demand, as sending MIDI data involves no audio processing, which is what really eats CPU power.
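To back up the claim that MIDI output is cheap next to audio rendering, here’s a rough sanity check (a sketch using generic MIDI and audio figures, not anything from Polyend’s internals):

```python
# Back-of-envelope comparison of MIDI throughput vs. audio workload.
# Assumed generic figures: a MIDI note-on is 3 bytes; classic DIN MIDI
# runs at 31,250 baud with 10 bits per byte on the wire (start + 8 data
# + stop); audio is taken as 44.1 kHz stereo.

MIDI_BAUD = 31_250           # bits/s on a classic DIN MIDI link
BITS_PER_WIRE_BYTE = 10      # start bit + 8 data bits + stop bit
NOTE_ON_BYTES = 3            # status + note number + velocity

# Maximum note-on messages per second across the entire link
max_msgs_per_sec = MIDI_BAUD // (BITS_PER_WIRE_BYTE * NOTE_ON_BYTES)
print(max_msgs_per_sec)      # -> 1041

# Samples the audio engine must render every second for stereo output
audio_samples_per_sec = 44_100 * 2
print(audio_samples_per_sec)  # -> 88200
```

Even a fully saturated MIDI link moves on the order of a thousand small messages per second, while the audio engine has to produce tens of thousands of processed samples per second per stereo pair, so the relative cost of adding MIDI tracks should indeed be small.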
What do you want to achieve?
The same approach as with the Play, where you can have 8 audio channels and 8 MIDI channels operating at the same time, giving you 16 channels in total (arguably more through polyphony on the MIDI channels). Having parity between the two products lets you explore so much more between them, and it greatly expands what is possible with the Tracker.
Oh! Certainly good to know it wasn’t just me thinking along these lines!
I assume that if the request was closed, it must be because it simply isn’t technically possible or something, but it would be good to get confirmation of that. I really think it could change how the Tracker gets used if it were doable, though.
Something I wanted to add to this: when it comes to combining the Play and the Tracker, I tend to find there’s an assumption that the former would be controlling the latter. I think there’s also a case for using them the other way around, with the Tracker controlling the Play as a rhythm section, for example. Arguably that may be possible now, but with the expansion to 8 audio and 8 MIDI channels you would open up all manner of ways to use the Tracker - driving the Play across 8 MIDI channels (for example) while triggering various samples locally on the Tracker.
That’s what I love about both devices; there’s overlap, but sufficient difference too, so combining them can massively extend the possibilities.