[console] Add dual monitor support to console-direct driver #1980
Conversation
It should be noted that the addition of this new stuff to
Wow @vkoskiv, this is really amazing! Great work!! Give me a bit to find some time to see how you've made this work, but I'm super impressed with the screenshots. And I have to say, running

I haven't had but two seconds to look at the code, but I'm seeing lots of C->op->var. The double indirection all the time is expensive. How about the idea of just including the

The above suggestion would also help for adding a config option for this, as I think the extra 1.7k bytes is too much to force on people that don't have multiple monitors, and, in some configurations, we're within about that much space of running out of our near kernel CS address space! The extra data required in struct Console could be ifdef'ed out when not configured, along with any unneeded routines.

I'll get back to you shortly with more input. Thank you for this amazing contribution!
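To illustrate the double-indirection concern, here is a minimal sketch of the two access patterns; the structure and member names are hypothetical, not the actual ELKS definitions:

```c
struct output {
    unsigned int vseg;          /* video memory segment */
    unsigned int crtc_base;     /* CRTC I/O base */
};

struct console_indirect {
    struct output *op;          /* every access is C->op->field */
};

struct console_flat {
    unsigned int vseg;          /* fields live in the console itself */
    unsigned int crtc_base;
};

unsigned int seg_indirect(struct console_indirect *C)
{
    return C->op->vseg;         /* two pointer loads per access */
}

unsigned int seg_flat(struct console_flat *C)
{
    return C->vseg;             /* one load: pointer C plus a fixed offset */
}
```

On 8086-class code generation the extra load shows up at every access site, which is why the flat layout tends to produce smaller and faster code.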
Ah! I had a hunch that that might be the case as I was adding those in; I just didn't look at the generated code yet :]
Great - please go ahead with that and push another version after you have that working. I should hopefully have more time soon.

If you want to take a pass at a configuration option, a very simple method is to define a file-local config option, say CONFIG_CONSOLE_DUAL, near the top of console-direct.c. Then you can play with adding (or not) the struct Console variables during development, and we can afterwards put the config option into the appropriate place inside the configuration management system. Turning the config option on and off within the file allows for a simple comparison of text/data sizes. BTW, for that, I would use
For that, try
Now that I see you've gone ahead and implemented this for many of our various console*.c drivers, a better place for the temp config option for the moment would be in console.h.
Sure, you can do that provided it doesn't get messy, or just ifdef it out entirely when CONFIG_CONSOLE_DUAL is defined; your choice.
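A minimal sketch of how such a temporary, file-local gate might look during development (names other than CONFIG_CONSOLE_DUAL are illustrative; the real option would later move into the ELKS configuration system):

```c
/* Temporary development switch; comment out to build the original
 * single-monitor driver and compare text/data sizes. */
#define CONFIG_CONSOLE_DUAL

struct console {
    int cx, cy;                 /* cursor position */
    unsigned int vseg;          /* video memory segment */
#ifdef CONFIG_CONSOLE_DUAL
    unsigned int crtc_base;     /* per-display CRTC I/O base */
    int display;                /* which physical display this console is on */
#endif
};

#ifdef CONFIG_CONSOLE_DUAL
static void probe_secondary_adapter(void)
{
    /* probing/setup for the second adapter compiles away entirely
     * when the option is not defined */
}
#endif
```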
Wow!! This is amazing and very cool! console.c is used in pc-98 too, so let me know if there are build errors :)
Seeing the two apps simultaneously is priceless!
Yes, while the ELKS kernel overhead in processing tons of ANSI escape sequences emitted from

@vkoskiv, you're welcome to run
Hello @vkoskiv, I've been thinking more about some ideas for a possible minor (re-)implementation of this PR. As mentioned above, we definitely don't want to use double indirection. And we want as much speed as possible, but certainly the "fixed" values of things like

Since struct Output would be going away, we really only need as many struct Consoles as there are max consoles * monitors, which is 6, and possibly only 4 if MDA hardware doesn't support multiple pages? So we need /dev/tty1 through /dev/tty6, or through /dev/tty4 for a single MDA page. Each struct Console basically maps directly onto a /dev/ttyN; just the ordering of the mapping may differ depending on the hardware installed. In the MDA-only case, /dev/tty1 (through N when there are more pages) would be allocated, so system programs can always assume a working /dev/tty1. For the EGA-only case, /dev/tty1 through 3, just like it is now, and for the multiple-monitor case, map EGA to /dev/tty1 - 3 and MDA to tty4 onwards. This may be how you have it designed now.

For the

For EGA+CGA, the EGA consoles would be allocated first, then CGA. For VGA, ELKS treats VGA as EGA and doesn't support any VGA-specific features, so it's probably not worth bothering to add options to ELKS to use a special (non-0xB800) hardware address for EGA/VGA.

I am assuming that your initialization routine can basically differentiate between hardware video memory at B000, B800 and A000, all in some kind of text mode. Are you relying on the BIOS to initialize each monitor's text mode? I haven't looked at that part yet.

I haven't fully studied your code, so some of what I've suggested may not quite be the way you've implemented things, but I thought to discuss this more before/while you're refactoring the code. Thank you!
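A rough sketch of that allocation order (hypothetical names and constants; the PR may arrange this differently):

```c
#define COLOR_PAGES 3   /* EGA/CGA text pages used as consoles */
#define MDA_PAGES   1   /* MDA is assumed to have a single page */

/* Color consoles are allocated first so /dev/tty1 always exists and maps
 * to the color adapter when one is present; MDA takes the next tty,
 * or /dev/tty1 on an MDA-only system. */
static int total_consoles(int have_color, int have_mda)
{
    int n = 0;

    if (have_color)
        n += COLOR_PAGES;       /* /dev/tty1 .. /dev/tty3 */
    if (have_mda)
        n += MDA_PAGES;         /* /dev/tty4, or /dev/tty1 when MDA-only */
    return n;                   /* number of /dev/ttyN devices to register */
}
```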
Hey @ghaerr ! Took a few days longer than expected to get back to this, as I fell ill with the flu the very next day after opening this PR. Feeling much better now, so I'm excited to get back to working on this!
Agreed, I'll change those back.
The maximum number of consoles possible, as far as I can tell, is 5. We can't have multiple CGA cards in the system, as their CRTCs and vram would conflict. CGA should support 4 console pages, but
The way it's designed now is that we ask the BIOS to tell us which display the user has selected as the 'primary' one. This happens with switches on the IBM 5150, and I'm assuming BIOS options on more modern motherboards. So
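For reference, the place this selection usually surfaces is the equipment-flags word in the BIOS Data Area at 0040:0010h, whose bits 4-5 hold the initial video mode. A hedged sketch of reading it (the PR doesn't show the exact code, and read_far_word() is a hypothetical helper):

```c
#define BDA_SEG       0x0040
#define BDA_EQUIP     0x0010    /* equipment-flags word */

/* Hypothetical helper reading a 16-bit word at seg:off in real-mode
 * memory (ELKS has its own peek routines for this). */
extern unsigned int read_far_word(unsigned int seg, unsigned int off);

/* Bits 4-5 of the equipment word hold the initial video mode chosen by
 * the motherboard switches / BIOS setup: 11b means 80x25 monochrome. */
static int primary_is_mda(void)
{
    unsigned int equip = read_far_word(BDA_SEG, BDA_EQUIP);

    return ((equip >> 4) & 0x03) == 0x03;
}
```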
Side note, I remember trying the Nano-X stuff earlier, and as far as I can tell, it's not intended to work with CGA or MDA? In any case, I propose a check in the
Right, so I'm just dealing with the specified video memory addresses and CRTC I/O bases.
No, so the primary source of 'bloat' coming from these changes is that we now have logic for setting up video hardware, as we cannot rely on the BIOS to initialize anything but the primary one specified by the user.
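Roughly speaking, bringing up the non-primary adapter means programming its 6845 CRTC directly through its index/data port pair. A sketch of that mechanism, assuming the usual outb(value, port) primitive; the actual 80x25 text-mode register values come from the adapter documentation and are omitted here:

```c
/* outb() is assumed to be the usual port-output primitive (value, port). */
extern void outb(unsigned char value, unsigned int port);

#define CRTC_BASE_MDA  0x3B4    /* index port; data port is base + 1 */
#define CRTC_BASE_CGA  0x3D4

static void crtc_write(unsigned int base, unsigned char reg, unsigned char val)
{
    outb(reg, base);            /* select CRTC register */
    outb(val, base + 1);        /* write its value */
}

/* table[] holds the adapter's 80x25 text-mode register values
 * (taken from the adapter documentation). */
static void crtc_load(unsigned int base, const unsigned char *table, int n)
{
    int i;

    for (i = 0; i < n; i++)
        crtc_write(base, (unsigned char)i, table[i]);
}
```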
Thanks!
I've taken care to update
The bloat of
So 464 bytes smaller, but still 1248 bytes of bloat when compared to how it was originally. I'd expect gating with
@ghaerr What should we do about the spinning I/O indicator that lives in

It relies on the VideoSeg global.
Cracked open the 5150 to test this with the MDA card selected as primary, and it works as expected. The assignment ends up as:
Having the same member names in struct Console will help, thanks. For those (new) names in struct hw_params, the new member names are probably better: they're more descriptive and keep the HW changes only in console-direct.c, whereas console.c is shared between the other console drivers/platforms, so leaving its member names unchanged will help.
Great, probably just as well to start gating right away; some of my comments will mention more detail about it.
It was probably done a long time ago for space reasons. There are other complicated issues though, for instance the "fake" FAT filesystem /dev emulation is limited as to how many /dev entries there can be. So for now keeping /dev/tty1 - 3 as CGA is a good idea to reduce complications that can be dealt with later.
Ok. What BIOS Data Address (BDA 0x40:XXXX) is being used for that info?
Yes, sounds good.
@tyama501 just added CGA support to Nano-X. MDA is not supported.
Let's not change the logic yet on
Got it. That's fine, as long as all hardware-specific logic is gated. That will ensure full boot compatibility without the option. (It has taken us a while to achieve booting on almost any 8088 system, which is why we're using only a single BDA video mode for the current startup).
Great!
Perhaps leave a VideoSeg global which is set unmodified outside the gate, but set to the video segment of whatever your new code determines the primary monitor to be when CONFIG_CONSOLE_DUAL is defined. I have some other comments on the last commit, will comment directly on the code for that.
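Something along these lines, as a sketch (the display descriptor and helper names are hypothetical):

```c
/* Hypothetical display descriptor filled in by the probe code. */
struct display { unsigned int vseg; };
extern struct display primary_display;

/* Global video segment used by the spinning I/O indicator and other
 * code outside the console driver. */
unsigned int VideoSeg = 0xB800;     /* legacy single-monitor value, as before */

#ifdef CONFIG_CONSOLE_DUAL
static void update_video_seg(void)
{
    /* point VideoSeg at whatever was probed as the primary monitor */
    VideoSeg = primary_display.vseg;
}
#endif
```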
Force-pushed from 721ddfd to 9bc13b4.
If the rebase approach is okay, I could apply the rest of the fixes like that, too, so we'd end up with 2-3 good commits and no more extra fixup/feedback commits. Both ways are fine by me.
I think we're OK with the rebase commits, but git is not my forte (nor do I want it to be!). Agreed, there have probably been too many fixup commits to be of much interest to people, so a rewritten history is fine by me for this PR. I plan on either a squash or merge commit at the end of this; a merge commit would be nice if you want to keep a few good commits showing changes.

The Width/Height/MaxRow/MaxCol rename is looking good - it is showing exactly what I wanted to see and making it easy to ensure no bugs. The big question now is whether to optimize out Width/Height, or MaxRow/MaxCol. I want to take a slightly closer look, but it seems that if you went with moving MaxRow/MaxCol to local variables, it could save quite a bit of code generation in the shared console.c routines. As far as speed goes, either way won't matter with the Scroll/Clear routines etc., as they take way longer than what we're concerned with.
@vkoskiv: I just reviewed our large diff. With regards to the Width/Height/MaxCol/MaxRow we've been talking about: I say you should go ahead and remove MaxCol and MaxRow as members of the Console struct, and replace them at the top of any function which needs them with a local variable of the same name computed from Width or Height. This should lessen the amount of code produced in console.c::AnsiCmd() by quite a bit, and frankly, Width and Height are much more commonly understood than the zero-based equivalents MaxCol/MaxRow. This method also helps keep porting errors to a minimum, as the diff will be simple to inspect. And if we're lucky, there'll be some savings as a result of the removal, to balance the added bloat when CONFIG_CONSOLE_DUAL isn't defined. What do you think about that?
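In other words, roughly this pattern (sketched with otherwise hypothetical names; the real struct lives in console.h):

```c
struct console {
    int Width, Height;          /* kept as members */
    int cx, cy;                 /* cursor position */
    /* MaxCol and MaxRow are no longer stored per console */
};

/* Example: clamping a cursor-position request inside AnsiCmd-style code. */
static void set_cursor(struct console *C, int col, int row)
{
    int MaxCol = C->Width - 1;  /* recomputed locally where needed */
    int MaxRow = C->Height - 1;

    if (col > MaxCol) col = MaxCol;
    if (row > MaxRow) row = MaxRow;
    C->cx = col;
    C->cy = row;
}
```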
Force-pushed from f194307 to 1c4e2c6.
@ghaerr Sounds good to me! I have a patch ready for it; I'll just do some checks to ensure I didn't break anything. In the meantime, I addressed most (all?) of the other feedback, mostly as fixup commits to the existing commits.
Ok @vkoskiv, glad to see you back and get this finished up. Thanks for all the changes. I'll do another code review and then a hopefully final one after the MaxRow/MaxCol elimination. I am anxious to learn how much the code size changes up or down for the console driver when the DUAL option is not selected, given some of these later optimizations.
Don't worry about that if your time is short - I'm happy with the commits you've pushed and plan to do a merge commit.
Thanks to some clever design from the IBM folks, even the original 1981 IBM PC had support for more than one display at a time by 1983. The CRTC I/O bases and vram segments of the MDA and CGA adapters don't overlap, so nothing prevents the system from addressing two compatible adapters simultaneously. The switch on the PC motherboard just tells the BIOS which output it should set up on boot.

This patch introduces the concept of an output to the console drivers, and adds support for probing, setting up, and managing multiple display adapters to the console-direct driver. There are no new options. The adapter indicated by the BIOS as the primary adapter is set up as the primary display, and a potential secondary adapter will be set up as a secondary display, if present.

Depending on the runlevel, each display can get zero or more consoles allocated to it, and switching between them works the same as before, by using CTRL+ALT+F<N>. To indicate which display is active, the blinking cursor is toggled such that it only appears on the active display.
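On 6845-style adapters the cursor can be hidden without touching video memory by setting the disable bit in the cursor-start register, which is presumably how the toggling works; a hedged sketch, reusing the crtc_write() helper sketched earlier:

```c
/* crtc_write() as sketched earlier: select a 6845 register, write a value. */
extern void crtc_write(unsigned int base, unsigned char reg, unsigned char val);

#define CRTC_CURSOR_START   0x0A
#define CURSOR_DISABLE      0x20    /* bit 5: cursor not displayed */

static void cursor_set_visible(unsigned int crtc_base, unsigned char start_line,
                               int visible)
{
    crtc_write(crtc_base, CRTC_CURSOR_START,
               visible ? start_line
                       : (unsigned char)(start_line | CURSOR_DISABLE));
}
```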
Since we can now have 4 consoles up at a time (1 on MDA + 3 on CGA), bump up the number of ttys available. I think CGA could do 4 pages on its own too, but we have to draw the limit somewhere. 3+1 feels nice and even :]
The init logic in console-direct was much reduced, too. We also now test vram before enabling and clearing a card, to ensure it's all there.
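A typical way to do such a presence check is a write/read-back pattern test at the adapter's video segment; a sketch under the assumption that peekw()/pokew()-style far-memory word accessors are available (names hypothetical):

```c
/* Hypothetical far-memory word access helpers (ELKS provides its own). */
extern unsigned int peekw(unsigned int off, unsigned int seg);
extern void pokew(unsigned int off, unsigned int seg, unsigned int val);

/* Return nonzero if writable memory responds at the given video segment. */
static int vram_present(unsigned int seg)
{
    unsigned int saved = peekw(0, seg);
    int ok;

    pokew(0, seg, 0xA55A);
    ok = (peekw(0, seg) == 0xA55A);
    pokew(0, seg, saved);           /* restore whatever was there */
    return ok;
}
```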
- Moved struct members for better alignment
- Replaced enum with #define
- Removed type_string() function
Force-pushed from 1c4e2c6 to f0abea8.
All of this is looking very nice indeed! The removal of C->MaxCol and C->MaxRow really helped make the diff smaller, which definitely helps regression testing via eyeballs. I am going to fly my drones for a bit but will take a final look afterwards and hopefully get this committed today :)

The build is failing because crtc_init and crtc_probe are undefined - please test without CONFIG_CONSOLE_DUAL defined; that's likely the problem.
Force-pushed from f0abea8 to 117c196.
I've done test builds with different sets of options, and it builds fine on my system with
Try running
I can repro with
Force-pushed from 117c196 to 9e86008.
A new config option is now available: CONFIG_CONSOLE_DUAL. Disabling it makes `console-direct` behave like it did before, spawning 1-3 consoles depending on the primary display type.

Also:
- Renamed params -> crtc_params
- Renamed init_output() -> crtc_init()
- Renamed probe_crtc() -> crtc_probe()
- Marked crtc functions as INITPROC
.bss is already zeroed, so we don't need these extra zeroes taking up space in the kernel image.
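Illustratively (hypothetical array name): with the toolchain in question, an explicitly zero-initialized static can end up in the initialized data section and occupy bytes in the kernel image, while an uninitialized one lands in .bss and is zeroed at load time for free.

```c
/* Stored in the image: eight explicit zero bytes in initialized data. */
static unsigned char scroll_buf_init[8] = {0, 0, 0, 0, 0, 0, 0, 0};

/* Stored in .bss: takes no space in the image, zeroed at startup. */
static unsigned char scroll_buf[8];
```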
Force-pushed from 9e86008 to dba7eb1.
I changed the
@vkoskiv: You've done a great job on this, thank you for the wonderful enhancement! I'm going to commit this now. I will follow up with a PR to make the change from unsigned short to int for Width and Height, with a few other minor changes, as well as an answer to your question above. I'm anxious to see how the kernel code and data size changes for this with DUAL on and off. We can continue discussion on that PR if needed. Thank you!
[kernel] Updates to console code from #1980
I wanted to use two displays at the same time on ELKS, so I rewrote most of the console-direct driver to make this happen. Still very much a WIP, but this is already working quite nicely on my hardware.
IBM folks were clever enough to ensure that I/O ports and vram segments didn't collide between the MDA and CGA cards, so both could be used at once.
Not too many programs in the DOS era supported this. Lotus 123 and Borland's wonderful Turbo Debugger are good examples.
Now ELKS, with its full preemptive multitasking, can take advantage of this feature, for some really cool dual-terminal UNIX sessions on even the most ancient IBM hardware.
This PR introduces the concept of an output to the console drivers, and makes use of that in console-direct.c, which now has logic for probing, initializing and managing multiple output devices.

Switching between displays is fairly straightforward - each display gets allocated 0-N consoles, and the active display is indicated by hiding the cursor on the inactive display. Switch with CTRL+ALT+F<N>, like before.

A few more things require attention before merging:
- The spinning I/O indicator relied on the VideoSeg global, which is no more. I'll probably just hook that up so the indicator is displayed on the primary display. Thoughts?

Below are some photos of the thing in action on my hardware.