
Any fellow Linux users on here?

I just got curious and did a quick check; I'm still running X11.

echo $XDG_SESSION_TYPE
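For anyone else curious, loginctl can report the same thing; a minimal sketch, assuming a systemd-based distro where $XDG_SESSION_ID is set in your terminal:

loginctl show-session "$XDG_SESSION_ID" -p Type    # prints e.g. "Type=x11" or "Type=wayland"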
Same here. But then Mint 22.0 is much closer to Ubuntu 24 these days....finally using the same kernel instead of a much earlier one.
 
I didn't have any real problem using xorg until I got a 4k monitor. When scaling the display to 2x, the fonts are unbearable under xorg, but it works perfectly with wayland.
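In case anyone wants to try the same 2x scaling from the command line, a minimal sketch, assuming a GNOME-based desktop (Cinnamon uses the org.cinnamon.desktop.interface schema instead, and other desktops differ):

gsettings set org.gnome.desktop.interface scaling-factor 2    # force 2x UI scaling
gsettings set org.gnome.desktop.interface scaling-factor 0    # 0 = let the desktop decide automatically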
 
I didn't have any real problem using xorg until I got a 4k monitor. When scaling the display to 2x, the fonts are unbearable under xorg, but it works perfectly with wayland.
My monitor is 4K but I never bothered trying to up the resolution to that, perfectly happy with a 1080 screen.
 
My monitor is 4K but I never bothered trying to up the resolution to that, perfectly happy with a 1080 screen.
That's one thing that isn't clear to me....(no pun intended). Even with a 4K-capable monitor, can you default to 1080p and still get quality comparable to "native resolution"? Is this true?

Or in reality are you looking at a somewhat fuzzy 1080p and settling for it? I always thought that if you have a 4K monitor you must default to 4K resolution for an optimal picture.

(I also prefer to keep my resolution at 1080p to preserve maximum performance.)
 
Or in reality are you looking at a somewhat fuzzy 1080p and settling for it?
It certainly isn't fuzzy. I've got very keen eyesight and if the display isn't quite right it really bugs me but mine is just fine. That should show in the screen grabs I've posted in here.
 
It certainly isn't fuzzy. I've got very keen eyesight and if the display isn't quite right it really bugs me but mine is just fine. That should show in the screen grabs I've posted in here.
Yeah, post your screen captures....especially text. Let's see how it stacks up against mine.

I've always thought every monitor has its own native resolution, always the highest possible depending on the monitor. I have to assume you're getting visually shortchanged using a 4K at 1080p...but I've never actually been able to verify it.

Then again, can a screen capture really determine this, visually speaking? LOL....I'm not sure. Plus you have to consider the graphics compression of the .jpg format. o_O
 
Perhaps there is a difference in gaming? Not so much in everyday use, in my experience. I've used Linux on pretty much every type of device.
Yeah, the main issue (and I assume the latest drivers fixed this) is that games using DX12 through Proton noticeably run, or ran, worse on Linux if you have an NVIDIA card vs an AMD one.

I can't do a comparison myself because I don't know which of my Steam games use DX12 (and I don't even have an AMD system running on Linux to compare against, much less any system running Windows for a baseline). But I've read numerous posts on Linux gaming forums and discussion threads where people described a DX12 game that might run at 100 fps on Windows but only at 70 fps on Linux with the same settings, because of the NVIDIA issues.

Now of course, to me, running at 70 fps would still be fine. I have a 144 Hz display and I honestly can barely tell whether a game is running at 100+ fps or at 60-100 fps. But I do get it: even if you're fine with how it performs, it's still disappointing that a game runs worse on Linux than on Windows (even if it's not at unplayable levels).
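If anyone wants to do a rough comparison on their own machine, here's a sketch, assuming Steam with Proton and MangoHud installed; each line goes in a game's Launch Options:

mangohud %command%        # overlays an FPS counter so you can compare the same settings
PROTON_LOG=1 %command%    # writes a log to ~/steam-<appid>.log; grepping it for d3d12 is one way to tell whether a game went through vkd3d-proton (DX12)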
 
Yeah, post your screen captures....especially text. Let's see how it stacks up against mine.
[Attached screenshot: screen90.jpg]
 
Problem is, this is still a compressed graphic. The text in the graphic looks quite inferior to actual text rendered through my browser and video card.

Too many variables in play to make a real comparison. But then with your printing background and eyesight I'm guessing you're looking at well-defined fonts rendered at 1080p.

Also, I default to a dark background and light text, which for me renders far better-looking fonts than black fonts on a white background.

Too bad though, I'd like to find some source that verifies that any 4K monitor can be set at 1080p and still deliver optimal image quality. It's the one thing that would keep me from getting a 4K monitor depending on what the reality of this situation really is....
 
Click on the image to see it in its native size, and although it's a compressed jpg it gives a reasonable idea of what I'm looking at.
Still fuzzy compared to my actual browser text quality, which has gotten progressively better with each new distro version. Reversing your text and background *might* yield a different result.

I just turned off dark reader and even then my dark text on a white background<yuck> still looks more polished than your screen capture. But any graphic is limited by its compression, unless you can reduce it.

LOL...these days a white background is just excruciating to my eyes! :eek:
 
I suspect that running your 4k monitor at 1080p would usually have the same effect as running it at 4k while scaling the UI 2x. The difference would be if you wanted to look at something high res like a 4k image. In that case your computer would scale it down to 1080p and then your monitor would scale it back up to 4k for display, which would be lossy.
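A quick sanity check of the numbers, assuming a standard 3840x2160 panel (shell arithmetic, just to illustrate):

echo $((3840 / 1920)) $((2160 / 1080))     # 2 2 -> each 1080p pixel maps onto an exact 2x2 block of panel pixels, no fractional interpolation needed
echo $((3840*2160 / (1920*1080)))          # 4   -> native 4K means pushing four times as many pixels as 1080p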
 
I suspect that running your 4k monitor at 1080p would usually have the same effect as running it at 4k while scaling the UI 2x. The difference would be if you wanted to look at something high res like a 4k image. In that case your computer would scale it down to 1080p and then your monitor would scale it back up to 4k for display, which would be lossy.
I tried playing a 4K video and had exactly the sorts of dramas you're describing. Not just poor image quality but also tearing and constant lag.
 
I tried playing a 4K video and had exactly the sorts of dramas you're describing. Not just poor image quality but also tearing and constant lag.
For me 1080p was when displays hit the point where it was "good enough" for most purposes. 4k lets me have very sharp text, which is nice, but it is not a game changer vs 1080p. It takes a lot more processing power to push 4k pixels, which isn't always worth it.
 
My only concern in this case is simply the diminishing number of monitors that default to a native resolution of 1080p. Still available, but it's like retailers want to sell only the latest and greatest from a gaming perspective.

But with a 1660 Ti video card, I have to weigh any higher resolution against keeping fps as high as I can....at least in theory. Those "performance issues" you speak of.

There's also the fact that higher resolution on the same size monitor means looking at a smaller interface and text....which is hard on this old man's eyes. So for me 1080p works, whereas 4K may have "deal-breaking" problems in comparison.
For me 1080p was when displays hit the point where it was "good enough" for most purposes. 4k lets me have very sharp text, which is nice, but it is not a game changer vs 1080p. It takes a lot more processing power to push 4k pixels, so I am not surprised to hear about performance issues.
 
There's also the fact that higher resolution on the same size monitor means looking at a smaller interface and text....which is hard on this old man's eyes.
Me too. That is why having the feature to scale the UI to 2x size is important to me. That way it is very sharp and high resolution without everything being too tiny to use.
 
For me 1080p was when displays hit the point where it was "good enough" for most purposes. 4k lets me have very sharp text, which is nice, but it is not a game changer vs 1080p. It takes a lot more processing power to push 4k pixels, so I am not surprised to hear about performance issues.
That was pretty much my thinking when deciding to stick with 1080p. It might be nice, but not enough to justify the resources required. And as Judge just mentioned, I'd have to rescale all my fonts and icons etc. to suit my older eyes.

When I bought this monitor I wasn't specifically looking for a 4K monitor, but when I talked to the bloke in the shop about building a gaming computer he said "We've got these monitors on special at the moment..." so that's what I ended up with.

A 27 inch screen at 1080p is kinder to old eyes.
 
