Arqade Asked by Alan De Smet on August 9, 2021
How can I run old games that insist on a lower resolution (say, 640×480 or 800×600) when my display refuses to run at those resolutions?
Most such games will run windowed, which works, but it's a bit of a shame. It would be nifty to get them scaled up to full screen, with pillarboxing if necessary.
More seriously, some games simply refuse to cope. Sam & Max Season 1 will run windowed, but wedges (locks up) when I try to open the options, so because it doesn't like my display, I can't change the audio mix.
My searches so far have only turned up GameCompanion. It's not clear that it will work, and the unusual distribution channels (Bethesda RPG mod sites and Facebook) make me a bit uncomfortable.
Edited to add: I have an Nvidia card, and it turns out that it will let me add "custom" resolutions. 800×600 worked, 640×480 didn't. It's documented here, though the current interface looks different: http://www.nvidia.com/object/custom_resolutions.html Unfortunately this isn't a general solution; it didn't work for 640×480, and I don't know if it works for other video cards.
Maybe running it in windowed mode would help; I'm not sure, though. To run something windowed, create a shortcut to the game's .exe file, right-click the shortcut and open its Properties, then append " -window" (without the quotes) to the Target field.
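For example, with a made-up install path, the shortcut's Target field would end up looking something like this (not every game understands -window, so if it has no effect, check the game's documentation for its own flag):

    "C:\Games\MyOldGame\game.exe" -window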
Answered by Valentin Grégoire on August 9, 2021
You could try setting your monitor scaling mode to 'No Scaling' via your GPU control panel (this option exists in the Nvidia Control Panel at least; I'm not sure about AMD). That way, when you run a program at a particular resolution, the screen renders it as if it were native, and the program appears inset in a smaller area with black borders. This might get around your problem, or at least let you access the options menus in-game.
Alternatively, if the problem is only with changing the resolution, have a hunt around for the game's configuration files (e.g. in your My Documents folder); these are mostly human-readable, so you can modify settings without needing the in-game options menu.
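As a rough sketch of that approach (the file location, section, and key names below are invented; every game uses its own, so open the real file first and see what it actually contains), a few lines of Python can patch an INI-style config to a resolution or windowed mode that the in-game menu won't let you pick:

    # Sketch: flip resolution/windowed settings in a game's INI-style config file.
    # The path, section and key names are hypothetical - check what the game
    # actually writes (often under My Documents or in the install directory).
    import configparser
    from pathlib import Path

    cfg_path = Path.home() / "Documents" / "MyOldGame" / "settings.ini"

    cfg = configparser.ConfigParser()
    cfg.read(cfg_path)

    if cfg.has_section("Display"):
        cfg["Display"]["Width"] = "800"
        cfg["Display"]["Height"] = "600"
        cfg["Display"]["Fullscreen"] = "0"   # 0 = windowed, in this made-up example
        with open(cfg_path, "w") as f:
            cfg.write(f)
        print("Patched", cfg_path)
    else:
        print("No [Display] section found; edit", cfg_path, "by hand")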
Answered by Mechorpheus on August 9, 2021
Warning: this post is loooooooo- wait for it -ooong. These are hints and tips from my collected experience of hooking up different (not always compatible or intended) displays to my Windows machines, old and new alike, at different (typically unsupported) display modes and resolutions. I'm posting this for posterity's sake, and for Google users as well. Also, please keep in mind that these are not sure-fire methods. There is a possibility that all of these will fail, in which case you're simply - out of luck.
Please be advised that some solutions provided here are advanced-tech-stuff and/or hacks, and can potentially make you unable to use a display, or - in extreme cases (though I know of exactly 0 such cases) - could damage your display and/or card. Therefore all solutions are provided as-is, and I take no responsibility for any damages resulting from applying (or misapplying) them. You put these instructions/hints into practice at your own risk.
Please note that I am referencing two particular software products: PowerStrip and UltraMon. These are paid-for programs, and I am in no way affiliated with them or their makers, nor do I receive anything for any endorsement of any kind. I use them because they're mature technology, so they're reliable (well, UltraMon is, in any case). You may find alternatives to them at your leisure.
And as with everything, your mileage may vary. Let's begin.
I am assuming you have an LCD (or some other non-CRT-based) display as a monitor. These are generally a right pain in the proverbial behind, because they have a fixed 'native' resolution, and drivers apparently have been made to keep yours there. CRTs (the big, bulky ones) shouldn't have a problem with going low-res.
Please note: not being able to go low-res (800x600, 640x480) may be caused by either Windows not letting you, or the display not letting you (or both, in fact). Windows XP and above will, by default, not allow 640x480, but at least Windows XP can run at this resolution (and will even display a balloon tip that it's below the recommended minimum, which is 800x600). If it's Windows, the problem can be solved with software. But if the display itself shuts off when you try to force the low resolution, it may not be accepting it at the hardware level, in which case the only solution is to use a different display.
My solutions to the problem:
-- If you are on a PC (a stand-alone machine) - two possibilities
I) you have an old CRT display (the big and bulky kind) around somewhere [this is the fastest solution]
The fastest way to do what you want might be to find an old CRT lying around somewhere and hook it up. It may be a pain to switch between the displays, but hey - you're already doing something not expected nowadays.
Windows may not allow you to change to 640x480, but the game may be able to run in this mode without so much as a whimper on a CRT. Many fullscreen applications have ways to switch resolutions (if they're truly fullscreen, that is) above and beyond what Windows will allow. Have faith.
If that doesn't work, then...
II) you want to try forcing the display resolution (any display)
You could try using a program that forces custom resolutions, like PowerStrip. I myself use it to run a PAL TV via the VGA connector (with minimal extra custom hardware for sync compatibility). It requires 720x576 at 50Hz interlaced (with a REALLY low pixel clock, on the order of 15kHz), and it works like a charm. This works equally well on Windows XP x64 and Windows 7 x64. I have three cards I've done this with, all nVidia - the GeForce 210, GeForce GTX 550 Ti and a GeForce 7600 GT. All this while remembering that modern displays don't even KNOW what "interlaced" is.
Please bear in mind that this will require a lot of back-and-forth, a significant time investment, and some knowledge to pull off (but if you lack the knowledge, you will pick it up along the way, trust me).
Again, theoretically, you should be able to either add 640x480 if it doesn't exist (this should almost always work), or override it if it already exists but isn't accessible (this may cause conflicts).
If the game doesn't properly switch to the desired resolution after starting (which can always happen - using PowerStrip is basically a hack here), then alt-tab out of it and switch the resolutions manually via its notification area right-click menu. If the game doesn't do resolution refresh (as in, doesn't try to change the resolution when the display is already at the desired one), then switching the resolution and then launching the game should help.
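If you would rather script that last step than click through PowerStrip's menus every time, here is a rough sketch (Windows only, needs the third-party pywin32 package, and the game path is made up) that switches to 640x480 with the standard ChangeDisplaySettings call, launches the game, and restores the desktop mode afterwards. Note that this can only select modes the driver already exposes; it cannot force brand-new ones the way PowerStrip can:

    # Sketch: temporarily switch the primary display to 640x480, run the game,
    # then restore the registry-default mode. Requires the pywin32 package.
    # The game path below is hypothetical - point it at your own .exe.
    import subprocess
    import win32api
    import win32con

    def run_at_low_res(game_exe, width=640, height=480):
        # Start from the current mode so colour depth and refresh rate stay intact.
        devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
        devmode.PelsWidth = width
        devmode.PelsHeight = height
        devmode.Fields = win32con.DM_PELSWIDTH | win32con.DM_PELSHEIGHT

        # CDS_FULLSCREEN marks the change as temporary (not saved to the registry).
        result = win32api.ChangeDisplaySettings(devmode, win32con.CDS_FULLSCREEN)
        if result != win32con.DISP_CHANGE_SUCCESSFUL:
            raise RuntimeError("Driver refused %dx%d (code %d)" % (width, height, result))

        try:
            subprocess.run([game_exe])               # blocks until the game exits
        finally:
            win32api.ChangeDisplaySettings(None, 0)  # back to the default mode

    run_at_low_res(r"C:\Games\MyOldGame\game.exe")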
If it helps at all, I don't use the stock nVidia drivers - I go with TweakForce (formerly Tweaks'r'Us) all the way. It shouldn't make much difference, but you may get lucky.
-- If you are on a laptop - two possibilities as well
I) you have an external VGA or DVI connector AND an old CRT lying around
You essentially connect it the same way you would to the PC, but there's a catch - you cannot disconnect the primary display as you would on the PC. It's built-in after all. And the thing about games is that they will always run on what is considered the primary display. The only way to have them elsewhere is to drag their stubborn ass... err, window, to where you want it. But you can't drag a fullscreen display mode...
What you want to do, is to make the external monitor the primary display. Many drivers (incl. nVidia's) will allow you to do this. That way the game will start on the external display. But there's a catch here as well - many games assume single-display mode and don't lock their controls down, which means that whenever you reach the display boundary w/ the mouse, it will "pop out" and the game will lose focus. Sometimes the game will behave as if alt-tabbed when this happens. All sorts of weird stuff can follow (incl. changing the display palette to 256 colors - happened to me more than once).
What you really want to do is to switch to single-display mode on the external monitor. That way it will be single-display (no boundaries to worry about) AND it will be the primary (the only one, in fact). If you can do this, it would be the holy grail here, but not all laptop display card drivers allow access to this function (I think many do not). However, UltraMon (or other similar piece of software) will come to your aid, as it has the ability to force a particular display off, no matter what the device driver says about this function's availability for the end-user.
PLEASE BE ADVISED! PowerStrip and/or UltraMon run at startup by default, and can be configured (by accident or intentionally) to apply the default settings right at startup. This means your primary display (the built-in one) could be disabled after you log into Windows, and without the external monitor hooked up, you won't have a display at all. Of course, hooking up the external display will solve this problem (a reboot may be required), so it's no biggie. Just keep it in mind so you don't panic if this happens to you. Worst-case scenario - you can disable PowerStrip's (or UltraMon's) startup entry from Safe Mode (msconfig is your friend), which will eliminate the problem altogether.
Also, if the problem is with PowerStrip, hard-rebooting the machine will force it to ask if you'd like to disable the default configuration as it will sense that the last time it was running, the computer aborted unexpectedly and it may be at fault here.
If you really don't want to fiddle with an external display, or if you don't have a port for it anyway (some laptops don't), then...
II) you want to try to force the resolution as-is (any display)
Exactly like in II) for the PC, but keep in mind that many laptop displays won't allow custom resolutions. (Intel's display drivers are notoriously bad with this, and it doesn't even end there, oh no...)
I realize that this isn't the easiest or most intuitive way to do this, but technology has moved on, and there are many things we today consider "useless", like the low-resolution display modes, or interlacing (at least on the computer and outside video editing/studio work). That's why it's always advised to keep an old box or two around for the nostalgia value - they're guaranteed to work with the old games. Just don't hook 'em up to the Internet, as this might kill them even before you sit back down from connecting the Ethernet cable (msblaster or sasser, anyone?).
It could have been worse, you know... If you wanted to play games on really old, fixed-resolution displays (like the Apple Lisa's - or the TV I mentioned...), you couldn't do it with software alone. You'd have to build some custom electronics (typically to deal with the sync signal, either composite sync or sync-on-green) to get them working.
Also please keep in mind that these hints and tips will NOT work on Apple computers - OS X dynamically manages displays, so even turning one off (so as to watch a movie on the second one w/o looking at the desktop) will switch the displays around (which is annoying as all hell!). These tips are Windows-centric, but -will- work on Linux using different software (modeline is your friend). I do not know of any software/solutions for OS X, though they may exist. I was never annoyed/bothered enough to look for them. (also, I think games on OS X won't need these workarounds anyway, but there's always the oddball case...)
Answered by Egon_Freeman on August 9, 2021
Right-click on the desktop and click Screen resolution --> Advanced settings --> on the Adapter tab, click List all modes --> select the resolution you need (the list also includes modes below the usual minimum).
From here you can select 640x480 True Color, etc., after which your resolution will be changed. (But I'm not sure whether your app will work even after changing the resolution.)
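If you want to see that same list programmatically (for instance, to check whether 640x480 is even offered before changing anything), here is a small sketch; Windows only, and it needs the third-party pywin32 package:

    # Sketch: print every display mode the driver reports for the primary display.
    # This is essentially the same list the "List all modes" dialog shows.
    import win32api

    i = 0
    while True:
        try:
            mode = win32api.EnumDisplaySettings(None, i)   # None = primary display
        except win32api.error:
            break                                          # no more modes
        print("%dx%d, %d-bit, %d Hz" % (mode.PelsWidth, mode.PelsHeight,
                                        mode.BitsPerPel, mode.DisplayFrequency))
        i += 1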
Answered by Kanad Chourasia on August 9, 2021