How does software get more colours out of the VGA by changing the palette at run-time, while the VGA is rendering scanlines? Do they time it by counting how many pixels have been processed in a given interval (using the horizontal and vertical frequencies and the number of horizontal pixels in the display resolution)?
So if a game wants to change the palette or DAC on every scanline, how would old software do this? How would it know it's time to change the palette/DAC?