PC to NDS video streaming kinda thing [1]
This summer, my cousin and I (he knows more about computers than me despite being younger) decided to start a project together. He had the idea to stream video from the pc to the nds, and to send the inputs from the nds back to the computer, so we could play games on the nds via streaming. The goal was to be able to play a 3ds game on an emulator and stream it to the ds, which would be pretty funny.
To achieve this we had to capture video on the client side on the pc, send it to the server running on the ds, and have the ds render the image and send back the inputs.
My cousin didn't know much about network programming, and I'm still learning about graphics, so we kinda split the work that way. He would handle how the client reads images from the screen, and I would handle how the communication works and how to display the images on the ds screen.
All the testing in real hardware would be done on this beauty:


- Nintendo DSi XL Chocolate -
For the server on the ds I used libnds, which is included in the devkitPro toolchain. I really recommend devkitPro if you're looking to develop any kind of homebrew program for a Nintendo console.
Looking through the libnds examples, I found out that you can use sockets just like you would in a normal c program on linux. Using the examples and my assignments from last year's computer networks class, I set up a very simple test client and server: a program on the ds that showed the console's local ip address and waited for messages. Then, on the client, you specify the ip address and port and send messages, which are displayed on the bottom screen.
I don't have much footage because I wasn't planning on writing about this, but here you can see how it waits for a connection:

Now we needed to decide how we were going to actually do the video streaming.
Our first approach was to just use the 2d engine and render a texture as a background. Then we could receive the color of each pixel and modify the texture accordingly.
We needed as much speed as possible, and losing packets wasn't a concern because the next one would arrive eventually, so we decided to use UDP sockets.
I was able to send some pixels to the ds and have them rendered, but this is where it starts getting complicated. It turns out the ds can't receive packets larger than roughly 1 kB or 1.5 kB, and the speed at which it processes them is pretty slow too, so our method had to change.
We decided on a system in which we send the first image, and then, instead of sending more full images, we send the difference between consecutive frames. For example, if a pixel has an rgb value of (50,50,50) in frame A and (50,50,100) in frame B, we send (0,0,50). This is very useful because when little changes from frame A to frame B, few pixels carry meaningful data and the rest are all 0, so the data compresses down to far fewer bytes, and we can send more frames with the same bandwidth.
We are using UDP, so packets will be lost. To prevent too much corruption, we'll have to send a full image once in a while.
I still have to think about how I'll manage all of this, and find out whether the ds can handle it, but meanwhile I'm also looking into how to render the data we receive.
libnds provides a graphics API pretty similar to opengl, which I'm relatively familiar with, so the plan is to have a quad with a texture, and modify that texture manually by writing to the ds video ram. This way we can change the resolution of the images we're rendering at runtime.
I'm still struggling to get a quad on screen, so I'll have to keep working on it. I'll write another one of these when I'm further along with the project. Btw, meanwhile my cousin has already managed to capture video from the pc and scale it to any resolution. You can even decide whether to capture the whole screen or just one window. I'll have to catch up, so I'll get to work... soon... probably...
Thanks for reading :)