Although 4K TVs seem to have become cheaper recently, I wonder if people are actually excited about 8K quality. Now that both Sony and Microsoft have announced that their new consoles will support 8K, will that lead to 8K TVs becoming more popular and thus cheaper as well? For example, Sony is currently selling an 8K TV for $70,000 apiece, which honestly seems outrageous.
With some 4K-supporting games already being 60+ GB in size, it's only going to get worse for games that support 8K - perhaps we'll be looking at 150 GB per game, or even more? A positive consequence could be that more and more countries will start investing in fibre-optic cables, enabling people to download such games in a few hours instead of several days. ISPs will hopefully stop gouging people and offer actual unlimited data plans without any data caps whatsoever.
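For a rough sense of scale, here's a back-of-the-envelope sketch (the 150 GB figure and the connection speeds are just illustrative assumptions):

```python
# Back-of-the-envelope download times for a hypothetical 150 GB game.
# Connection speeds are sold in megaBITS per second, file sizes come
# in gigaBYTES, so convert: 1 GB = 8 * 1000 megabits (decimal units).

GAME_SIZE_GB = 150

def hours_to_download(speed_mbps, size_gb=GAME_SIZE_GB):
    size_megabits = size_gb * 8 * 1000
    seconds = size_megabits / speed_mbps
    return seconds / 3600

for speed in (25, 100, 1000):  # DSL-ish, cable, fibre
    print(f"{speed:>5} Mbps: {hours_to_download(speed):5.1f} hours")
```

At 25 Mbps that's over 13 hours; on gigabit fibre it drops to about 20 minutes, which is exactly the gap between "several days" and "a few hours".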
What do you guys think? Is 8K something you're actually excited about?
I wasn't supportive of 4k and I wont be supportive of 8k. They're memes. Someday they'll be standard but it's just useless fluff.
Fuck no. I haven't even bitten on the 4k meme yet. It's just another needless "upgrade" that won't even be financially feasible for 75% of the intended userbase for another ten years
Will ultrawide ever be mainstream?
not really, unless you're using some tv-sized monitors, and then the question arises: why the fuck would you even use such a thing
everything above 1920x1080 is complete shit on 24" screens as it makes everything too small, and no, scaling is not an option as it also sucks. 27" is too big, and 1920x1080 is also too low for it.
Fags say the same shit about everything all the time
well, for gaming on pc, 4k will also be good at 24", if game devs put enough work into their games; it's only the desktop and internet pages that are still not ready for anything above FullHD
>being this much of a delusional console fag
I didn't even know what 4K is supposed to mean until I saw OP's image. I haven't been the type to fall for meme technology for almost a decade now.
Good luck with your 80" inch computer monitor then, nerd. :^)
A meme, like 3D television. Not saying it won't happen in the future, but they're just pushing it out early to fleece the early adopters. Like everything that comes out way ahead of when the market is ready to financially support it: try again in another 10-15 years.
Have fun with your 480p potato poor fag
There's barely a demand for 4k, and as someone who does have the equipment for reliable 4k, it's really not especially important for anything that isn't actual work. It can slightly improve readability in games with sewage-tier art direction, but at that point it's an issue of development, not proof of the necessity of 4k.
No 16K is the sweetspot IMO.
>3840 x 2160
Strictly, that's only UHD. True DCI 4K is 4096 x 2160.
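The difference is small but real; a quick sketch of the two (standard figures, nothing assumed):

```python
# "Consumer 4K" (UHD) vs cinema DCI 4K: same height, different width,
# and therefore a slightly different aspect ratio and pixel count.
uhd = (3840, 2160)
dci = (4096, 2160)

def megapixels(res):
    w, h = res
    return w * h / 1e6

print(f"UHD:    {uhd[0]}x{uhd[1]} = {megapixels(uhd):.2f} MP, aspect {uhd[0]/uhd[1]:.3f}")
print(f"DCI 4K: {dci[0]}x{dci[1]} = {megapixels(dci):.2f} MP, aspect {dci[0]/dci[1]:.3f}")
```

UHD is exactly 16:9 (1.778); DCI 4K is the wider ~1.90:1 cinema container with about 7% more pixels.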
Human eye can't see more than 100 pixels.
Most everything these days has proper HIDPI scaling.
> 27" is to big
Oh, you're a retarded shit.
i've been using 24" for at least 10 years and it was fine all those years. i tried that 27" 1440p 144hz ips monitor from asus: it was too big and everything on it was too small at 1440p, and that ips panel sucked so hard with edge glow and light bleed all around this 550 buck piece of junk.
GET IT RIGHT FAGGOT IT'S 10
I watch videos in 144p in a corner of my screen.
Games are fine at 1920x1080.
There barely is demand for 4k…
Find me one console that can run a game at a minimum of 60fps in 1080p on a 144hz display without cheating on LOD and other gimmicks, that isn't Tetris.
And no the latest ones can't.
>Console gaming will be the future of gaming after all.
Streaming consoles, bait-kun.
I love it how anyone who says 4K is a meme is just admitting that they're an insufferable 3rd-world poorfag.
The 8k they claim for both these new consoles is 8k OUTPUT. It just means you can output true 8k through their HDMI 2.1 ports. It's for watching movies and the like. Not 8k gaming. The specs leaked for these things months ago; if you think you're actually going to be playing a AAA title at 8k 120 FPS because "those words were used by Microsoft!", you're just wrong.
People are expecting 8k gaming because the words "8k output" were used, but the xb1x was called an "uncompromising 4k experience" and the DEMO TITLE ran at 1440p with checkerboard rendering.
So they made claims saying "uncompromising 4k experience" and didn't even get close. They mention the word 8k, and people are expecting it in their AAA games.
Sorry, it's just not happening. Knowing what we know about these consoles, there's a good chance we'll get 1080p 120 fps, or 4k 60 fps, with some launch titles, probably for the first year or two. After that, unless they do a mid-gen refresh, I don't see them staying at that level, unless games don't get any "better" graphically.
Oh, and the reason 8k was included is that they probably couldn't NOT include it. Even last generation's cheapest GPUs came standard with 8k output. Anyone who knows much about hardware and heard "8k" from both Sony and Microsoft probably said "yeah, duh" and waited to hear the next feature.
t. 800x600 Windows XP Gold Edition Brazilian
>There barely is demand for 4k
Most people don't even want 1080p anymore except poorfags but poorfags always lose in the end because people want better shit, not garbage
Only thing garbage about 1080p is it isn't 4:3
> ISPs will hopefully stop gouging people and offer actual unlimited data plans without any data caps whatsoever.
You mean this isn't the norm for you?
I download 12 megabytes (not bits) per second, with no caps, and pay around 7 bucks a month. Guess it sucks to be American.
One of the most annoying things to happen over the last decade or so is console terminology becoming so pervasive that even self-proclaimed "PC gamers" use it. Namely, tacking a "p" onto the screen's height instead of referring to the full resolution. The whole "progressive" or "interlaced" distinction should only be used for broadcast TV. It's 320x240, not "240p".
Actually, that's more of a boomer thing. Inches and "p" is all they know.
4K is pretty good for gaming, provided the game supports that resolution and the textures are the proper size. Most games look like ass though, cause there are always some low-quality textures that stick out like a sore thumb.
Can you honestly not do a basic ratio calculation?
What about 1440x1080?
There isn't even demand for 4k.
Quite literally everyone has settled on 1920x1080. Bigger than that is pointless, or the screen becomes inconveniently big.
I think I had a cintiq around that resolution at some point, very nice monitor.
Bigger than that is pointless or the screen becomes inconveniently too big.
Here's one with boomer-tier knowledge. He can't even comprehend that pixels can be made smaller, even though he probably owns a smartphone.
It was 1600x1200, still a nice monitor
>Being this much of a drooling retard
Monitors still use interlaced and progressive, it isn't just a fucking TV thing moron
thats why i said pointless baiter-man, now go back to cuckchannel
Anyone who thinks 1080 is the only thing that matters is the same kind of faggot who used to think nothing would be better then 480
Well, if we're referring specifically to videogames, that all depends on the readability of the individual elements on the screen.
the only games that require 4k+ are ultra tryhard simulators like ARMA and Verdun where you sit in a spot and stare at foliage for an hour until a blade of grass moves so you can shoot at it
I thought I was being a bit too aggressive in saying that you can't comprehend packing the pixels into a smaller screen, but apparently you ARE actually that stupid. Or maybe you can't comprehend the part where more pixels can display a more precise image, possibly without the need for anti-aliasing.
>getting mad because technology is getting better
lol, always gotta be the samefags who get all pissy and elitist whenever people actually want to improve shit and make it better instead of sitting around playing 8-bit garbage like a fucking hipster fag
Just like the current generation, and what John Carmack warned about, we're going to get 8k at 30 FPS, just like how we got 4k at 30 FPS. 1080p at 60 FPS is far, far better than 4k at 30 FPS.
To add insult to injury, I doubt there will be any 8k televisions with even a 100Hz refresh rate. Most 4k TVs today use a 60Hz panel. Just how do they plan on displaying 8k at 120 FPS on a television when no televisions with high enough refresh rates exist? Many TVs lie to consumers about refresh rates but actually operate at 60Hz; only top-of-the-line TVs output 4k at 120Hz.
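The link bandwidth backs this up. A rough calculation of uncompressed video data rates against HDMI 2.1's 48 Gbit/s ceiling (8-bit RGB assumed; real signalling adds overhead, and 8K120 in practice leans on DSC compression):

```python
# Raw (uncompressed) video bandwidth for a few display modes,
# compared against HDMI 2.1's 48 Gbit/s link rate.

HDMI_2_1_GBPS = 48

def raw_gbps(w, h, fps, bits_per_pixel=24):  # 24 = 8-bit RGB
    return w * h * fps * bits_per_pixel / 1e9

modes = [("4K60", 3840, 2160, 60), ("4K120", 3840, 2160, 120),
         ("8K60", 7680, 4320, 60), ("8K120", 7680, 4320, 120)]
for name, w, h, fps in modes:
    gbps = raw_gbps(w, h, fps)
    verdict = "fits" if gbps <= HDMI_2_1_GBPS else "needs compression"
    print(f"{name:>6}: {gbps:5.1f} Gbit/s ({verdict})")
```

Even uncompressed 8K60 only just squeaks under the limit; 8K120 needs roughly double the link rate, hence the reliance on compression.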
>"technology is better"
whoa, so you're telling me you dont have 16 different 4k+ monitors hooked up to produce 64k EXTREME HD resolutions? what are you, some kinda poorfag? git rekt kiddo lol its technology
It's less about "technology getting better" and more about chasing something that's kind of meaningless for a severe performance detriment, especially in the realm of consumer-grade hardware.
I'm still doubting 8k at 30. Running a game at double the resolution isn't quite 2x as hard, so we can't make a solid ratio, but 8k is 4x the pixels of 4k. And 4k60 on these consoles is already going to be pushing it; I can't see them quadrupling the resolution while only halving the fps. I think the aim for these things is going to be 4k60/4k30, with 1080p being "unlocked" fps in games.
On a side note, that's a REALLY good thing, because it means game devs won't be able to tie their physics to the FPS. Games ported to pc will have an entire class of issues just gone thanks to the new console gen. So that's pretty cool!
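The scaling argument above can be ballparked. A naive sketch assuming the GPU is purely pixel-bound (it rarely is, so treat this as an upper bound on the damage):

```python
# If frame time scaled linearly with pixel count, a console that
# just manages 4K at 60 fps would manage this at 8K:

def naive_fps(base_fps, base_res, target_res):
    pixel_ratio = (target_res[0] * target_res[1]) / (base_res[0] * base_res[1])
    return base_fps / pixel_ratio

fps_8k = naive_fps(60, (3840, 2160), (7680, 4320))
print(f"~{fps_8k:.0f} fps")  # 8K has exactly 4x the pixels of 4K
```

Vertex work, CPU and memory costs don't all scale with resolution, which is why doubling the resolution isn't quite 2x as hard; but the pixel side alone already makes 8k60 implausible on 4k60-class hardware.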
I think there's a legitimate case to be made for 4k standard as an end to messy and aggressive anti-aliasing, but we're only just starting to see hardware that's capable of delivering that consistently without completely tanking performance. I can't really see much of a benefit to 8K unless you have an absolutely MASSIVE display - or possibly in VR headsets where your eyeballs are pretty much strapped directly to the screen.
First raytracing demo with actual gameplay
Where in the world, Latveria or Narnia? I don't see no fucking interlaced option in my settings.
It's a meme, not really different from 3D television and motion controls. The benefits to the average user are negligible and the performance hit is massive. Plus the market is swarmed with 1080p monitors and TVs at fairly low prices; not so much the case with 4k or 8k stuff.
If the current consumer-grade standard were a GTX 1080, then we could talk about 4K/8K maybe being possible (assuming, of course, the current graphical standard), but as things stand it's mostly just a bragging point and not much else. And unless hardware prices take a nosedive, that won't change.
4k legitimately looks better than 1080, sorry /v/bros.
There is always a desire for higher fidelity. Technology trickles down, and so will 8k. 4k is already cheap, and it got cheap a lot faster than 480 to 720, or 720 to 1080.
No shit retard.
Nobody's arguing that an increase of resolution isn't an increase in visual information.
Once it's cheap, I might pick up a monitor or something. I don't really care. I like 480i CRTs more than anything for gaming.
There isn't even a real demand for 4k.
Are you living under a fucking rock, mate? 4k is the new standard for TVs now. You can get 4k TVs for only a couple hundred quid now.
Well, according to Apple, for a decently sized PC monitor 4K to 6K is considered "Retina", which is a marketing term that does have some merit behind it, since it's meant to be the pixel density at which a human eye with 20/20 vision cannot distinguish individual pixels at a typical viewing distance.
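That "Retina" threshold can be sanity-checked with a little trigonometry: 20/20 vision resolves roughly 60 pixels per degree, so a display is "Retina" once one pixel subtends less than an arcminute at your viewing distance. The 27-inch size and 24-inch distance below are just example assumptions:

```python
import math

# Pixels per degree of visual angle for a monitor at a given distance.
# 20/20 acuity resolves roughly 60 px/deg (one arcminute per pixel).

def pixels_per_degree(diag_in, res_w, res_h, distance_in):
    ppi = math.hypot(res_w, res_h) / diag_in      # pixels per inch
    pixel_pitch = 1 / ppi                         # inches per pixel
    deg_per_px = math.degrees(2 * math.atan(pixel_pitch / (2 * distance_in)))
    return 1 / deg_per_px

for name, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160)]:
    ppd = pixels_per_degree(27, w, h, 24)
    label = "retina-ish" if ppd >= 60 else "pixels resolvable"
    print(f'27" {name} at 24": {ppd:.0f} px/deg ({label})')
```

By this measure a 27" 1080p panel falls well short, while 4K clears the bar, which lines up with Apple's 4K-to-6K range for desktop monitors.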
However, for video games the resolution needs to be much, much higher than 8K. You read that right: HIGHER. Our screens are not high-resolution enough to properly resolve any 3D game.
The reason anti-aliasing and level-of-detail management exist is temporal artifacting, which is caused by inaccurate sub-pixel rendering on edges. At high enough resolutions, anti-aliasing and LoDs actually become unnecessary.
Observe pic related: all displays are based on a grid, and at current resolutions, including 4K, the pixel density is not sufficient to properly resolve edges, so edges get snapped to whole pixels on the display. The current solutions mainly involve rendering edges to subpixels, which is less than ideal. This is why we need higher resolutions. As for level-of-detail management: at too high a detail density the display has to constantly switch which subpixels are being rendered, causing flickering and other temporal artifacts. (This is hard to illustrate because of the temporal nature; find a game that lets you set negative LODs and you'll see what I mean.) Basically, AA and LoD are resolution problems.
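The grid-snapping claim can be illustrated numerically. A toy sketch (the slope and the 16:9 width are arbitrary choices) that rasterizes a straight edge and measures how far the stair-stepped result sits from the true line, as a fraction of screen height:

```python
# Rasterize the edge y = 0.37*x by snapping each column to the
# nearest pixel centre, then measure the average stair-step error.
# The error is roughly constant in PIXELS, so as a fraction of the
# screen it shrinks as resolution rises: the edge "resolves".

def edge_error_fraction(rows, slope=0.37):
    cols = rows * 16 // 9
    total = 0.0
    for x in range(cols):
        true_y = slope * (x + 0.5)
        snapped_y = int(true_y) + 0.5   # centre of the pixel hit
        total += abs(true_y - snapped_y)
    return (total / cols) / rows        # fraction of screen height

for rows in (1080, 2160, 4320):        # 1080p, 4K, 8K heights
    print(f"{rows} rows: edge off by {edge_error_fraction(rows)*100:.4f}% of height")
```

Note that each doubling only halves the error, never removes it, which is why AA techniques that average coverage instead of snapping stay far cheaper than brute-force resolution.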
There isn't even a real demand for 4k. Why they hell would there be a demand for 8K?
tbh this wouldn't be an issue if we all just used vector monitors.
I don't get 4k monitors, much less 8k ones. They'd need to be ridiculously big to actually benefit from the higher resolution to any significant degree.
You'll want to use LoD techniques anyway, since they save processing power on meshes. You'll also want to keep using MIP mapping for textures, since a texture simply looks better when tailored for a smaller resolution rather than just shrunk. Valid points on aliasing, but there are already plenty of AA techniques whose implementation is certainly much less costly than rendering that many more pixels.
Only content creators with 8K+ cameras are genuinely interested in it. Most of the people I know personally are still watching movies and playing vidya on 1080p displays of some kind (I do as well).
Nigger I just got a 1440p 144hz 27in and its perfect as far as I'm concerned. But this is about TVs, not monitors.
Haven't most of the popular display interfaces only just started supporting 4k at ~120 Hz (creating an incompatibility problem with a lot of recent tech, especially since most display products are now media centers, which worsens the bottlenecking even further)? Either way, 4k at a stable 60 is a problem even for the most recent high-end cards, and scaling a display to a reasonable size would increase the costs even more.
Still not as big of a meme as IPv6, 4k is probably going to become affordable in 5 years.
Also, I believe most of this applies to VR as well, since that requires the output of at least two 1080p panels at 120Hz or more, which roughly equals 4k at 60.
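That equivalence is easy to check; the pixel throughput works out to be identical:

```python
# Pixel throughput of "two 1080p panels at 120 Hz" vs "4K at 60 Hz".

def pixels_per_second(w, h, fps, panels=1):
    return w * h * fps * panels

vr  = pixels_per_second(1920, 1080, 120, panels=2)
uhd = pixels_per_second(3840, 2160, 60)
print(vr, uhd, vr == uhd)
```

Both come to about 498 million pixels per second, though VR adds its own costs on top: stereo draw submission, lens-distortion passes, and the hard requirement of never missing vsync.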
Game sizes have little to do with resolution; it's mostly uncompressed audio and only sometimes video resources. No, the problem with ISPs in the US will not change for quite a while. One also does not require fibre to download terabytes of data in any reasonable time, and even then the first bottleneck would probably be storage space either way. 8K is practically impossible at any meaningful scale of application. Prices for 1080p 144hz monitors that don't suck still bite, and so do prices for rigs that can produce a stable output; also, you'd run into stability issues running monitors with different framerates on Windows.
The shitty NEETs are way ahead of you goodgoy consumers.
>Anyone who doesnt want to buy into the latest scheme to fool retards out of their cash is poor
>Complains about elitist
You're the worst shill I've seen all year.
Oi, but then you have to then pay for the loicense.
I make more money in a month than you and your entire family make in a year.
t. navy officer
4K looks gorgeous, but you need a monster PC to run any graphically high-quality game at decent framerates. 8K is about 4x worse in terms of performance, and it's almost impossible to see a visual improvement over 4K unless your monitor is over 30". I'd take better models, textures, lighting and scene complexity before a jump to 8K.
What the fuck did you just fucking say about me, you little bitch? I’ll have you know I graduated top of my class in the Navy Seals, and I’ve been involved in numerous secret raids on Al-Quaeda, and I have over 300 confirmed kills. I am trained in gorilla warfare and I’m the top sniper in the entire US armed forces. You are nothing to me but just another target. I will wipe you the fuck out with precision the likes of which has never been seen before on this Earth, mark my fucking words. You think you can get away with saying that shit to me over the Internet? Think again, fucker. As we speak I am contacting my secret network of spies across the USA and your IP is being traced right now so you better prepare for the storm, maggot. The storm that wipes out the pathetic little thing you call your life. You’re fucking dead, kid. I can be anywhere, anytime, and I can kill you in over seven hundred ways, and that’s just with my bare hands. Not only am I extensively trained in unarmed combat, but I have access to the entire arsenal of the United States Marine Corps and I will use it to its full extent to wipe your miserable ass off the face of the continent, you little shit. If only you could have known what unholy retribution your little “clever” comment was about to bring down upon you, maybe you would have held your fucking tongue. But you couldn’t, you didn’t, and now you’re paying the price, you goddamn idiot. I will shit fury all over you and you will drown in it. You’re fucking dead, kiddo.
>16:9 is the industry standard.
Legitimately makes me want to hurl, it's fucking unusable for anything except watching fucking movies.
Sorry there, guv'nah, I thought I was on fokin' /v/, not /tv/. Gonna 'av ta ask ye fer yer tv loicense now.
Artstyle > graphics
Graphics are pointless if the draw distance is piss poor and trees pop up at 100 yards away. If it has a simple artstyle and good draw distance, it's way better than muh graphics.
1080p still looks fine. And the games that focus on >muh grafix usually aren't fun to play anyway. I'd rather play PS1 games than modern AAA games.
Draw distance is part of graphics, stupid. "Graphics" doesn't just mean "polygon count".
That shit you call artstyle is hipster fag garbage that was dated 10 years ago
Except if you were being honest you'd acknowledge Graphics (Speaking of technical detail) and Art Direction are generally referred to as two distinct things.
You incomprehensible nigger.
>Everything is a scam because I say so
>No YOU'RE the shill
I started experimenting with a CRT as a secondary display and everything looks amazing on it. 1024x768 is beautiful, as is 1600x1200. The smoothness of motion, the clarity in motion, the slight softening from the focus of the electron gun: it's a costless anti-aliasing solution. The solution isn't higher resolution; there's barely even detail to resolve at 1440p, let alone 4K. The solution is displays that can actually support more than one native resolution.
This isn't 1995 anymore you nigger jew
>pushing the 8k meme when consolegarbage can't even run 4k smoothly and an >60FPS 8k ready desktops are top percentile hobbyist rigs territory.
also fuck E3 has brought the worst of marketing and cuckchan here, this thread is full of "WHAT ARE YOU POOR, GO OUT AND BUY THE [NEXT "BIG" THING] IMMEDIATELY WHAT ARE YOU WAITING FOR GO GO GO CONSUME BUY SPEND CATTLE."
"Technical detail" is a meaningless term. You'd have to define what you mean by that. Any reasonable definition would include draw distance and LOD scaling.
most movies are still mastered in 2k; Endgame, as high-budget a movie as you can get, was mastered in 2k.
4k still isn't ready yet for movies or games. most of the "4k blurays" you get are just a software upscale put on a disc. games can barely run at that resolution even on the "flagship" cards.
Life is a fucking pixel.
So far all I see is those faggots who get 3 monitors and "gamer" merch as the only people with real demand for this shit.
Some old games don't have scaled pixel elements, so if you want to play those games you have to lower your resolution to what they were designed for, or you won't even be able to read anything. Other games simply break past a certain resolution, because you have to fiddle with things you shouldn't to get them to work, instead of just biting the bullet and stretching a lower resolution to fullscreen. A bunch of emulators get more CPU-intensive when you upscale their internal resolution, so in many cases it's actually better to just stretch a lower resolution. Heck, the oldest emulators didn't even increase the internal resolution the way people think; it was just 2x interpolation, so you really were better off keeping the lowest resolution your monitor supports for best performance. Only morons went for higher resolutions at the time.
4k is fucking great. I still haven't upgraded being a poorfag, but I want it. 8k on the other hand might actually be a meme. There's an actual quality difference between 1080p and 4k. 4k to 8k is just size difference, really. Also, the human eye can't see more than 5k anyways.
No. Every single time that you hear about a "demand", it is a (((marketing))) ploy. The (((marketing))) mafia are the ones demanding more money and to keep being paid, so they artificially create propaganda to pretend that there are people interested in any new useless gimmick that the industry come up with.
Whenever there is (((marketing))) pushing for something, that something is literally not needed by anyone, anywhere.
16:9 is already a meme.
Civilians are fucking stupid. LMAO imagine having to wonder where the money you pay your insignificant bills with is going to come from. Pathetic. Now pony up my paycheck, you taxpayer bitches. I've got shit to buy on Amazon.
'Demand' is fabricated so companies can sell you shittier products, the same with chinkshit.
there isn't even a demand for 4k
4k is useless for anything other than a monitor. The TV companies are doing what digital camera makers did: instead of increasing image quality, they just increase pixel counts. I'd rather have a 1080p plasma with a good contrast ratio than some shitty LCD with 8k.
They want people to upgrade to shitty-quality 8k screens, then come out with OLED 8k so people buy twice, instead of buying one nice 4k screen once. And it will work on a lot of people.
>I just got a 1440p 144hz 27in
I have two 24in screens myself. 1440p is the perfect resolution with enough fidelity that you don’t have to run god awful Anti Aliasing in games.
>The AI is greentexting now
>Not even a line officer
How does it feel being viewed as a POG even by other POGs in a POG branch?
One of the downsides of anon-posting is machine-learning fuckery compared to other platforms, although there are definitely benefits.
You pay it all back in the end anyway, when you raise someone elses kid after your wife cucks you while you're on deployment.
I have honestly never seen 4k picture. My parents have one but they have no 4k source going to it. I've been using the same 1080p hdtv for many years. I might as well miss 4k entirely and get my pants blown off by 8k.
>is there an actual demand
Of course there's a demand, there's always some fag who wants MOAR PIXELS. The question is if there's enough of them to actually give a fuck about, or if 8K has just become the new buzzword for COOL NEW TECHNOLOGIES AND SHIT that some fag at the console companies mentioned to their boss.
I don't really give a shit about 8K, I'm going to not pay any attention at all to it until it becomes cheap enough that I'll get it for shit-all, or the world burns to the ground.
Too bad you're spending most of that on HIV meds.
We've had bots posting here for some time, anon. Never seen them?
Pointless; current top-of-the-line GPUs barely do 4k at an acceptable level, and making the workload 4x heavier is obviously not going to make that better.
>I didn't even know what 4K is supposed to mean until I saw OP's image.
it's pretty simple: 720p is an image 720 pixels high, 1080p is an image 1080 pixels high, and 4k is an image 2160 pixels high (the "4K" refers to the roughly 4000-pixel width, because it's a marketing term).
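Putting the common labels side by side (standard 16:9 figures; "4K" here is the consumer UHD variant):

```python
# "720p/1080p/1440p" count VERTICAL pixels; "4K/8K" loosely count
# horizontal ones, which is why the naming feels inconsistent.
RESOLUTIONS = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),   # UHD; cinema DCI 4K is 4096 wide
    "8K":    (7680, 4320),
}

base_w, base_h = RESOLUTIONS["1080p"]
for name, (w, h) in RESOLUTIONS.items():
    mult = (w * h) / (base_w * base_h)
    print(f"{name:>5}: {w}x{h} ({mult:.2f}x the pixels of 1080p)")
```

So 4K is 4x the pixels of 1080p and 8K is 16x, which is where the performance-cost arguments elsewhere in the thread come from.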
The cynic in me thinks the push for ever-increasing resolution, well beyond what hardware is reasonably capable of running, is merely to inflate file sizes in an effort to curb piracy. There will be some that jump on the memewagon, for sure, but for everyone else a game that installs at 45gb this year will be 200gb next, ostensibly because of muh 8k textures.
It's because it's supposed to have 4 times the pixels of 1080p, and "4X" didn't sound as good, I guess.
I very much doubt pirates won't just downsize the textures and share full HD versions of the game instead. If they want to inflate file sizes, they're going to have to also consider how many people actually have the space for their games, which may end up costing sales too or driving people to aforementioned pirated downsized versions.
E3 has certainly brought in you console cucks thats for sure. Of course it's always the console baby fags who want to keep technology back instead of moving forward because console fags still can't get 60fps because they're retarded
I had to do it.
Wait I just realized.
>Didn't copy the drop down arrow
>Generated version has it
>Goes off on its own about AI trying to change people to be "good"
It's just missing a post number.
I'm sure it has nothing to do with games being 23 FPS slideshows on consoles at even pseudo-4k scaled resolutions.
I don't even have interest in 4K for my media or games; why would I already be showing any interest in 8K? 4K is still a niche for content; most of the stuff advertised as 4K is upscaled from a lower res.
The only thing i can imagine doing at 8K right now is browsing through professional, very high quality pictures taken with the best cameras available, and i don't think anybody is going to be shooting that kind of porn in 8K anytime soon.
I got a 4K 50" tv for $300 and I can't tell the difference between 4k content and 1080p content. I can see it if I get real close like a retard, but from a comfortable distance, 1080p is enough.
The only use for ultra-high resolutions is VR headsets.
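There's real geometry behind that experience. A rough sketch of the distance beyond which individual pixels drop below 20/20 acuity (about one arcminute per pixel); the 50-inch size matches the TV mentioned above, and the acuity figure is the usual ballpark:

```python
import math

# Distance beyond which one pixel subtends less than one arcminute
# (roughly the 20/20 acuity limit), i.e. extra resolution stops helping.

ARCMIN_RAD = math.radians(1 / 60)

def acuity_distance_in(diag_in, res_w, res_h):
    ppi = math.hypot(res_w, res_h) / diag_in   # pixels per inch
    pixel_pitch_in = 1 / ppi
    return pixel_pitch_in / ARCMIN_RAD

for name, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160)]:
    d = acuity_distance_in(50, w, h)
    print(f'50" {name}: pixels blend together beyond ~{d / 12:.1f} ft')
```

Past roughly six and a half feet, a 50-inch 1080p panel is already at the eye's limit; 4K only pays off if you sit about half that distance away, which matches the "only if I get real close" observation.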
Or for monitors. You're simply too far away from a TV to notice, but a monitor is close enough to notice things like aliasing.
Only if it's made with 4k in mind. Unless you like niggerball there is very little to actually watch in true 4k, it's all upscaled trash. In the case of games you better have a beefy rig or you won't be able to get a decent framerate to begin with unless it's an older game which won't really benefit.
>unless it's an older game which won't really benefit
Older games benefit greatly as many don't have proper AA solutions.
Upscaling does nothing to fix jaggies, you would have better luck with supersampling and that can be done at lower base resolutions just as well.
>Is there an actual demand for 8K quality?
If you want real photorealistic graphics (not """photorealistic""").
Of course everything they do is wrong.
First: it should be a curved, encircling display with at least 120 degrees of horizontal FOV and 100 pixels per degree of density, not some tiny (by FOV) 40-50 degree TV screen on the wall.
Second: obviously their hardware would not be able to run native 8K graphics, only menus and upscales.
Aliasing is a completely different matter. The eye's vernier resolution is about 10x better, and aliasing (the shimmer of moving straight lines) can be seen from 10 times further away.
> No, the problem with ISPs in US will not change for quite a while
Elon Musk space interwebz would save burgers and world.
Stop upscaling your games and run them at a native resolution that matches your monitor. If that doesn't work, use your graphics card to force anti-aliasing. There, problem solved. If it still doesn't work, that's usually because you got rused into thinking it's true 3D instead of a sprite or something.
LCD monitors only have one native resolution.
If you ask me, 4K was useless.
Never saw the need, and never saw a difference
More space and detail than 1920x1080, but not too big to fuck up pixelart or old UIs or tank your framerate or anything. Bigger resolutions are fine but it takes time for everything to adapt to it.
You just proved yourself wrong. There's a reason that FOV graph is split into different sections.
I'm holding out for 1M.
1,000,000 x 600,000
Anything lower than that is blocky, ugly rubbish. I have 4K on my 2" screen flip-phone and it makes me physically vomit every time I have to look at the stone-age piece of shit lego-screen.
For the home computer that I just use for checking twitter on once or twice a week, I have 140 10" 5K plasma monitors linked together to emulate some kind of half decent overall resolution, but I can only use it for 2-3 minutes at a stretch before i start involuntarily slamming my face into the desk and keyboard until my eyes are bleeding.
>he was so close, but fell into a retarded heap of stupidity
320x240 was fine, anything higher than that and the game looks like worm-filled dogshit.
Back then your imagination filled in some of the blanks, when 640x480 came, everything started to look like plastic crap.
They are written there.
>320x240 was fine
I bet you get fucked by your sailors, faggot.
>quality so bad that even at 240 it looks like an upscaled heavily compressed video
Good job you nigger.
>Is there an actual demand for 8K quality?
So long as there is demand for photorealistic graphics, yes. The good news is that the human eye won't be able to tell the difference between 12k resolution and real life, so the end of the HD wars will happen in our lifetime.
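The 12k figure is at least the right order of magnitude, assuming a display that fills the entire visual field (both numbers below are ballpark):

```python
# ~60 px/deg saturates 20/20 acuity; the human visual field spans
# very roughly 200 degrees horizontally. Multiplying gives the
# horizontal pixel count needed to max out the whole field of view.
PX_PER_DEG = 60
H_FOV_DEG = 200
print(PX_PER_DEG * H_FOV_DEG)  # 12000
```

For a TV that only covers 40-50 degrees of your view, the same math gives roughly 2400-3000 horizontal pixels, meaning 4K is already near the ceiling at normal living-room distances.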
>the end of money grubbing marketing campaigns for useless shit will happen
>People will buy something they won't be able to tell the difference between
Just like Bluray, eh?
I know it's a shitpost, but I legit played the game Outward with an internal resolution like this to get a playable framerate(my video card is dead).
It was a pain.
It was pretty much playable, but the problem was anything far away from the character/camera: it was just a few colored pixels. It was tough identifying what gear bandits had until they were already fighting you.
Not everything is a linear relationship you fucking autist
Does it look proportionally better to the performance and price increase?
motherfucker there isn't even a demand for 4k, let alone higher resolutions. if you want half your wall to be a tv, you're a spoiled child
No there isn't. Things have limits. Will there be a desire for atom sized pixels?
This is the only comment with any sense. Things won't keep getting better. If you can't distinguish between the pixels anymore then there's no need for higher fidelity.
They'll just fnd a way to upgrade our eyes anon, don't worry.
>Most everything these days has proper HIDPI scaling.
If all you use are mainstream phone OSes and macOS, yes. Otherwise, hell no!
My eyesight isn't even good enough to notice 4K let alone 8K
I have a 43-inch 4k monitor; it's definitely worth it for my particular setup, but I'm mainly using it for work. I tend to drop down to 1080p @75hz for a smoother gaming experience on my aging GTX 770.
If you just want to play games on a regular sized monitor I'd say go for 1440p @120hz+, it is noticeably sharper than 1080p but you can still get over 60fps with a mid range card in most games.
One feature that often gets overlooked is HDR. While a couple of implementations have been poor, it can make a huge difference in image quality (Forza Horizon 4 being probably the best example), and I'm genuinely looking forward to HDR becoming standard.
8k won't be worth it for at least a decade.
>Now that both Sony and Microsoft have announced that their new consoles will support 8K quality
>implying CY+4 PCs can even do 8K at more than 20 fps, let alone stable 4K60 without overclocked NASA tier rigs
>implying "8K" on 9th gen consoles won't just be upscaled 1080p25 using meme filters in yuv420p assuming the Soystation 5 and Xbox one 2 XX can even render every game at 1080p natively without dying.
>4k isn't needed
I'm very happy with my 4k monitor since I get the benefit of gaming on a large monitor, 26 inch, with a high pixel density, so it doesn't look like shit, like it would with 1080p.
It's not an AI, it's a neural network.
There isn't and has never been an AI created by humans, an artificial intelligence is counter to natural intelligence. It is a form of consciousness created outside of organic life.
This isn't me being pedantic, these are two completely different things and should not be used interchangeably. I don't give a shit what the public thinks.
240p looked fine, I don't see the point in chasing higher resolutions. It's not like the games are any better.
Spending most of your money on medicine lulxdamericucks
Greentext existed long before imageboards did, greenhorn.
Of course. I want to look at loli pantsu in as high a resolution as possible.
nothing even makes use of 4k monitors properly yet
Niggers, >>16587173 is right. Even your shitty $200-300 tvs 40 inches or bigger are all 4k. It's not becoming the standard, it IS the standard. So of course the shitheels at Microsoft and Sony are targeting both it and beyond. Burger 4k tv saturation is around 31% of households and it's expected to reach 50% by the end of the year. 8k as of now is a "look at how expensive my tv is" thing. But even then, prices on those are starting to range from 5-10k for 65-82" 8k tvs. Those $70k tvs are for car dealerships or other businesses with shekels to drop. It makes complete sense for these companies to target the fastest growing tv technologies and it means there's a chance these consoles might not be hideously underbaked come next year.
Even if that is true, source your claims, shill. Almost nothing is actually in real 4k. At best it's all upscaled.
>Burger 4k tv saturation Is around 31% of households
I highly doubt that. Most people don't buy a new TV every year or two. And the adoption is much lower than that for PCs, even though that is more enthusiast technology. >>16587943
real gamers will stick with 1080p for a long time, video cards havent gotten any faster in almost 10 years
Studies say 360p TVs are getting popular and already exist in 16% of all households, projected to rise to around 40% by christmas. There's an abundance of second hand 1080p and 4k monitors because people are abandoning them in droves.
>t. tv salesman
Some relevant sources and reading material.
You also have to consider all the people that haven't bought TVs in the last 2 years who are replacing their older televisions. The adoption rate only increases over time due to most/the only products available being 4k. 4k on PC is still an enthusiast level luxury which is why you still see so many entry level monitors being 1080p. That's still a different market from televisions.
When OP says "actual demand" he means that people who demand them must not only be willing to buy it but also have the means to afford it.
I heard that 8k monitors cost 5 grand so I can say for myself that I can't afford it. Plus, if I were to buy one, my older games and movies would probably look like shit on it, and there's hardly anything new I'm looking forward to, so I wouldn't buy one even if I had the means to afford it.
>TV salesman actually provides sources
Mainstream demand? Of course not. High-end demand? Certainly. There is a whole segment of society that buys the latest, best thing simply because they can and because it's impressive, even if the improvement is marginal. The same types of people who buy a $100k car to drive around town will buy an 8K TV to put in their living room and brag to their friends about. In turn, those sales will go back into the manufacturing process, making 8K more affordable, until it eventually becomes standard. This is how the entire TV market has worked for decades.
I really hope the parents or doctor of Jazz gets charged with something when he ends up killing himself
Lmfao, /v/ has never been this ass blasted before.
The 4k vidya meme is being pushed because graphics have largely stagnated and there's only so much heavy post-processing they can slap on games to shit up the performance and necessitate the purchase of new hardware. But wait, couldn't you just get a nice 4k monitor and use 1080p with integer scaling for really performance intensive vidya? Hahah, no. Nvidia and AMD have colluded to make sure you can't integer scale which gives a crystal clear image, instead you're forced to use awful blurry scaling, because they NEED you to run at native 4k to buy that hardware.
Even your 4k meme consoles don't do 4k all or even most of the time, they all use dynamic resolutions to try and hit framerate targets. That "4k 120fps" shit getting pushed at e3 is never going to happen.
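For anyone curious, the integer scaling that post is talking about is trivial to do in software. A minimal pure-Python sketch (illustrative only; a real driver would do this on the GPU, and `integer_scale` is just a made-up name for this demo):

```python
# Nearest-neighbour integer scaling: each source pixel becomes an
# s x s block of identical pixels, so a 1920x1080 frame maps exactly
# 1:4 onto a 3840x2160 panel with zero blur, unlike bilinear scaling.
def integer_scale(img, s):
    return [[px for px in row for _ in range(s)]  # widen each row s times
            for row in img for _ in range(s)]     # then repeat each row s times

frame = [[1, 2],
         [3, 4]]
print(integer_scale(frame, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Every output pixel is an exact copy of a source pixel, which is why the result stays crisp instead of smeared.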
>because graphics have largely stagnated
Yes and no. Maybe with consoles, sure, but PCs are still getting better and better. The only reason people think it's stagnating is the demand for "retro" pixel shit and "goofy" graphics that look like something out of the first Toy Story from 1995. The tech is all there but few are willing to take the time to use it, because time is money and money is money and money is time which is also money
>but PCs are still getting better and better
Sure, but where is that power going? Compare a maxed out crysis 3 (2013) to whatever the CY+4 wolfenstein or far cry games are pushing. Obviously crysis 3 was an outlier for the time, but that's what I mean, we had that fidelity 6 years ago. What have we really gotten since then besides additional disgusting post-processing?
We finally caught up to that level of fidelity, and instead of pushing the envelope with higher detailed assets, oh let's just bump the resolution and slather it in chromatic aberration.
I like chromatic aberration; rainbows are nice to look at.
Basically this. My biggest problem with 4k is the need for scaling, since I don't want a monitor over 27". This means images/UI that doesn't scale well would either be stretched (blurry) or too small.
I love posts like these.
>not bloating filesize while heavily degrading overall quality
You'll never make it in the industry, anon.
>using a television for vidya
enjoy your 50ms input lag retard
>50ms input lag
In 30fps titles you can break into 200+ms.
what's the source on this?
RIS turns up nothing.
I think the ideal monitor size should be directly proportional to your desk size, something like for every 3 square inches of desk you should have 1 square inch of monitor
Now they should show the fps with raytracing enabled. :^)
Is there actual vector monitor technology that could be used for displaying all the shit your generic LCD/LED displays?
I just don't see how a vector monitor could do all of that (rendering gradients, filled shapes, tons of colors and color combinations, etc).
>Tries to be a smart ass PC autist
Hey dumb ass, on PCs 320x240 is known as QVGA or if you want to get real autistic CGA.
I want 8K, but only because 8K is THE final resolution. Once you get to 8K, there is no purpose for a higher resolution in consumer products. 8K will be the peak and then finally, we'll return to acceptable performance being the norm.
I want to accelerate the resolution race so that we can end the resolution race.
I love hearing this as justification for buying the latest tech trend
I bet you really believe it too, don't you?
You don't realize that right alongside that 'tech race' are buzzwords and hype to sell new expensive shit to gullible idiots every year
I'll see you again at 16k, real excited
To fully cover the human FOV at eye-acuity angular resolution you need something like 16K. At vernier acuity (no aliasing artifacts) you need 160K. So the final end of the resolution race is in the 32-64K range with good antialiasing.
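The arithmetic behind claims like this is easy to sketch; the exact K-figure depends entirely on which FOV and acuity numbers you plug in (the 16K/160K in that post imply somewhat different assumptions than the common textbook figures used below):

```python
# Horizontal pixel count needed to cover a field of view at a given
# angular resolution. Crude model: assumes uniform angular sampling,
# while the real eye is only sharp in the fovea.
def pixels_needed(fov_deg, acuity_arcmin):
    return fov_deg * 60 / acuity_arcmin  # 60 arcmin per degree

print(round(pixels_needed(180, 1.0)))   # 1 arcmin (20/20 vision): 10800
print(round(pixels_needed(180, 0.13)))  # a common vernier acuity figure: 83077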
>To fully cover human FOV at eye acuity angular resolution you need like 16K
That has to do with aspect ratio, not resolution, retard.
Recommend me a television, Mr. Salesman.
only if you like burn in, LCDs are better for both input latency & longevity.
Let's try and get 1080p60 as a standard before we get any grandiose ideas about even having 4K games.
No you won't, because there's no point beyond 8k. The only way I'd buy a TV with greater than 8K resolution is if I owned an 8K TV, it died and they don't make 8K TVs anymore.
>Sony is currently selling an 8K TV for $70.000 a piece, which honestly seems outrageous.
Oh yeah, remember when they also introduced their first 4K TV? It was also priced outrageously if I remember correctly. I guess their marketing department wants something with outrageous numbers that no one else has so they can put an outrageous pricetag on it.
Currently I'd say the market isn't really there; maybe starting in 2-4 years we'll see blips of it. That's considering Microsoft and Sony's statements on their next gen consoles apparently having the ability to play at 8K. I don't know how that is going to work out, since the PS4 and Xbone are not that great at running things at 4K.
So I expect that they'll probably run things adequately at 4K, but will chug pretty bad at 8K. Unless some sort of hardware revolution happens, which I very much doubt.
Honestly, until other hardware gets to pace with running at 8K at a decent level of performance for gaming, which will eventually happen, it's probably not really worth thinking about until then. Though 8K will probably be great for watching movies and stuff for the time being for those who can afford it.
I don't think so.
> That "4k 120fps" shit getting pushed at e3 is never going to happen.
Not in the next generation, but ultimately with foveated rendering it's possible. Of course it would take much hard work to achieve.
Dumb ignoramuses don't realize that tick rate and refresh rates for the monitor would actually make their game consoles feel next gen. these are simple infrastructure/hardware upgrades. Dunno why the soyim are so riveted by extra pixels.
because MUH 4/8K usually comes with a bigger screen, every brainlet gets that. higher fps are not as obvious and consoles operate at the limit already anyway so upscaling is less of an issue.
I would love a Flatron with HDMI but there isn't any around
It's marketing and for dumbass normalfags whose brains shut down from hearing "2160p". The names 4K and Ultra HD were chosen because it's easier for the dumbass clerk at best buy to explain to normalfags what it is.
Not really. Down-sampling is more useful as it can be used on any monitor, acting as an AA substitute.
>need a monster pc
I'm driving 60fps 4k on a gtx 1070 and an i7 4770k
This thread is full of people who are completely fucking out of touch holy shit
is that a screenshot of DE:HR
at this moment 4K is the new salesman buzzword to sell TVs to BFU because nobody bought 3D shit 5 or so years ago
also there is almost no content in 4K and the little content there is is mostly upscaled 1080p
almost useless for gaming - the hardware just isnt there
give me 1080p at 120fps rather that 4K at 30 fps
in video, resolution is also the most useless parameter; bitrate (and therefore file size) is way more important
try downloading a 2GB rip in 1080p and a 10GB rip in 720p and see the difference
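Rough arithmetic behind that comparison, if anyone wants to check (assumes decimal gigabytes and a 2 hour runtime; `avg_mbps` is just a throwaway helper name):

```python
# Average video bitrate implied by a file size and runtime. Perceived
# quality tracks bitrate and encoder settings, not resolution alone,
# which is why a 10GB 720p rip can look better than a 2GB 1080p one.
def avg_mbps(size_gb, minutes):
    return size_gb * 8 * 1000 / (minutes * 60)  # decimal GB to Mbit/s

print(round(avg_mbps(2, 120), 1))   # 2 GB over 2 h:  2.2 Mbit/s
print(round(avg_mbps(10, 120), 1))  # 10 GB over 2 h: 11.1 Mbit/s
```

At 1080p, 2.2 Mbit/s is starved; the 720p rip has five times the bits to spend on every frame.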
Frame rate and image quality are far more important than resolution.
It looked pretty good in Dying Light.
OP why did you post an image where all of the examples are of the same quality? Also who gives a fuck about 8k, I'm sure it'll be fake anyway
the LG C9 has the lowest input latency you can get on a non CRT, as well as the best picture. Burn in is almost a non issue in modern OLEDs
>This fag only buys cheap TVs with >30ms response times
>He has the balls to call anyone else a retard
It’s straightforward and easy to determine the resolution needed for the application. 4K is already beyond the limits of the human eye. If you’re part eagle you might want more resolution. If you sit real close to your tv you might want more resolution. If you put your nose inches from your monitor you might be able to perceive the difference between 4k and 8k, but no one watches tv like that, and (relatively) no one wants to pay for the technology and bandwidth to support it.
Roadside billboards are printed at 5-10 dots per inch and no one ever complains about how pixely they are.
If you are looking at a 30” monitor that’s two feet from your eyes you might be able to tell the difference but you’re lying to yourself if you think the perceived image and motion are worth the extra expense.
There’s a reason consumer cameras keep going up and up in megapixels and the $5000+ professional models are still all around 10-20.
The same thing happens with audio. Nobody gives a shit anymore if their sound card is 192kHz 24-bit or the same 16-bit sound blaster grade we had in the 90s because nobody can tell the difference.
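A back-of-envelope version of that viewing-distance argument (assumes a 16:9 panel and the common 1 arcmin figure for 20/20 acuity; real acuity varies from person to person):

```python
import math

# Distance past which a ~1 arcmin eye can no longer separate two
# adjacent pixels, i.e. the point where extra resolution is wasted.
def max_useful_distance_in(diag_in, horiz_px, aspect=16 / 9):
    width = diag_in * aspect / math.hypot(aspect, 1)  # panel width, inches
    pitch = width / horiz_px                          # pixel pitch, inches
    return pitch / math.tan(math.radians(1 / 60))     # inches

print(round(max_useful_distance_in(55, 3840)))  # 4K 55": 43 inches (~3.5 ft)
print(round(max_useful_distance_in(55, 7680)))  # 8K 55": 21 inches
```

So on a 55" 8K set you'd have to sit under two feet away to resolve the extra pixels, which is the whole point being made above.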
I went back from 4k to 1600x1200. Seriously. Yeah fonts were blurrier at first and high res stuff looks prettier on the HiDPI 4k I have but it's overrated, especially considering the additional hardware requirements and also especially often the poor support in software for such high resolutions/DPI. Don't even miss it anymore. 16:9 is garbage and I was genuinely surprised how many problems this step solved in the overall usability department. One thing I also noticed was how unhealthy close I was sitting to the 4k screen.
It also has to be said that we humans don't have two perfect cameras in our head that record every frame perfectly, but that it's all open to interpretation from our brains anyways. If you get used to 1600x1200 then 1600x1200 is enough for you. I used to use computers with a resolution of 320x200 and that used to be enough, too. For me 1600x1200 is the sweet spot.
Why the fuck would I want to play a game at the highest settings or whatever? I can understand it for a film, but then I would never get one digitally if I could help it, so there's no point in upgrading.
Still using 1280x1024 in 2019. 4K requires a lot of media bandwidth and costly gpus to play, w10 scaling is shit even on 1920x1080. 1920x1200 is the best.
And watch your MPEG2/4 TV in 90s resolution, or shittily compressed "4K" youtube video.
might as well. Literally not worth the effort, cost, or problems involved
It is useful, in some applications, Excel and stuff.
For VR, not monitors or TVs.
It does not curb piracy, it just removes the incentive to buy those 50Gb titles at all.
For a TV, motion is more important than resolution. Looking forward to VRR becoming standard. But as for resolutions, 1080p is adequate, 4K is great, and 8K is more than most people can perceive from typical viewing distances (ie perfect).
Ace Combat 7 is 46GB, without cutscenes on piratebay it's just 16GB which means 30GB are cutscenes.
This means video content, AKA THE MOVIE SHIT WE WANTED TO LEAVE BEHIND, is two thirds of the game.
And this isn't the only game. You can do pretty good looking things in realtime these days, it's just the target audience and the publishers wanting MOVIE MOVIE MOVIE TV TV TV XBOX XBOX
It has nothing to do with piracy, especially when it's faster than PSN or XBONE Store or Steam
What I hate most is that they defy the medium.
You could have interactive cutscenes as part of your game but muh let's throw some prerendered pixels at the screen.
I still prefer "4K" because it at least refers to something quantifiable on the resolution, namely the width. I remember when cameras used to use the term "megapixels," fuck that shit with a rusty fork. I'm not going to learn a whole new baby scale for resolutions.
>not counting the sideline he makes allowing the filipinos to hustle meth
I bet someone would still manage to shit up their desktop at 8K.
Goddamn, can hardly read the term.
Honestly. I don't even watch movies any more and I'm in my mid 20s. Like I've never been serious about the damn things but I can't imagine me actually needing a 4K TV. I do like my blacks to be black though but at the end of the day it's still gonna be a shitty movie.
the newest compile of pcsx2 has an option for 8k and even 12k or 16k but my memory is foggy
same. i know many with a 4k tv, but no useful 4k images that make me want one.
>hey anon i got a 4k tv, want to see it in action?
>loads up xbox1x or ps4pro
>loads youtube app
>searches 4k 60fps
>finds concert with that in the title
>manually selects 4k60fps
>plays a live concert of some band
>yea man you gotta get one its awesome
i think i'll pass. right now i play games on 1920x1200 60hz 24". im going to upgrade to 1440p 144hz 27". no reason for a monitor to be bigger or higher res
for bigger displays i may get 60" 4k when i get a 2nd tv but my current 1080p120hz 52"sony tv from 10 years ago is fine. i dont watch tv and its fine for movies. i own like 6 blu rays. most are dvds and vhs tapes. but i rarely watch those either.
the big tvs are mostly used by my kid to watch paw patrol or curious george downloads.
I sure as fuck don't want 8k. On PC it just takes up extra resources needlessly, and for my other games i just use a CRT because they're almost all 480i, if that.
I'm playing FF10-2 and having fun. And that game is from 2001 I think.
8k 4k I just want to play games.
People have 4k TVs because for 2 years those have been pretty much the only screens sold.
It's much cheaper for factories to switch their entire production line to 4k panels.
I'm thinking of getting a 4k, but that's more because the 1080p tv that i used as my monitor died. The 4k tv I have my eyes on has less than 15ms of input lag so I don't really see an issue with it.
The demand is on the part of the publishers, who are trying to manufacture a want in consumers because they have nothing else left to interest anyone with.
Do you remember when everyone was keen on getting the latest GPU, overclocking, watercooling, SLI, etc. because it would let them play new games on the highest settings? Do you remember when there was a night-and-day difference between 'very high' and 'medium' which people were willing to pay hundreds of dollars for?
This is now the arms race which dominates the thinking in the industry. Normies will not accept a game which doesn't look as good as the last one they played. Looks matter more than gameplay and so that's what's being cranked up. Why are there so many cutscenes in AAA games today? Because it's much easier and cheaper to make a cutscene look good, where you have total control over the camera angles, lighting, animations, etc., than it is to make freeform gameplay look good, where you have to account for all the places the player might put himself and all the actions he might take.
The hardware industry caught on quickly and churned out more and more components. But they've now far outstripped the capacity of the developers to make games which actually tax the top-end hardware like they used to. As a result the latest GPUs are emperors without clothes - overpriced and underutilised. In an effort to maintain the illusion they've added gimmicks like Hairworks and RTX, which are hollow attempts to recapture the excitement of getting a new GPU in the mid- to late- 2000s.
Either we are truly reaching diminishing returns in graphical fidelity or no one cares enough to innovate anymore. You've all seen those images comparing Crysis to games released within the last two years. 8K is an excuse to do something with all this hardware capacity overhead we have because demand hasn't kept up with supply but no one wants to admit it. The real money in hardware is shifting away from video games anyway, since all the industries adopting machine learning techniques are buying up far more top-line GPUs than bankrupt basement-dwelling video game enthusiasts could ever afford.
Widescreen has been a stupid meme since its first introduction. Congratulations on falling for the scam.
Your second picture is 3 different sizes you retard
>Uses YouTube to show off new TV
God I hate normalfags. A 1080p bluray looks much better than any 4k streaming shit. If you wanna show off a 4k TV get a properly mastered 4k bluray.
the 4k DSR (Dynamic Super Resolution) setting on my 1060 works better than any kind of AA (no shimmering during movement) and I get a good 75 to 90fps in most titles at 4k (no overclock, just power target at 100%) and i'm still using a very nice 1080p screen.
Personally, I can hardly tell the difference between 720 and 1080. I can't imagine anyone can really tell the difference between 4 and 8.
Don't forget that most of these TVs do "after-sales monetisation" on you, i.e. they spy so they can sell the data Faceberg/Google style.
CTS can fit more people inside to spy on you though.
<just to play modern button-press movies
<classic old games will look shittier and shittier on these things
Nah, you can keep it
Good for you, is that 1080p screen 42 inches? My current tv/monitor has died. You can call me a pleb all you want because of input lag, but I have grown accustomed to a 42 inch 1080p screen. Besides, the said 4k tv can scale back the resolution to 1080p, and the main reason im siding with 4k is that the 1080p model in that size is only a 40 buck difference.
Is it even possible to buy a non-smart TV these days? My years old 1080p TV has all that streaming garbage installed and there is no way to get rid of it.
Is there literally any reason to go for the 4k monitor meme when I can just buy a super good IPS HDR 1080p monitor for a fraction of the price?
>Good for you, is that 1080p screen 42 inches?
No, its a 40-inch 1080p samsung flat panel IPS display from 2010. I use its game mode setting and that has about a 30ms input lag. it still has a great picture and looks great with (downscaled) 4k content. i'm not upgrading until the market calms down a bit and the HDR format war gets solved. also I'm not happy with the recent performance and life expectancy of OLEDs. i'm hoping Micro LEDs mixed with quantum dots are available at a good price when my screen finally dies. (I don't want to pay more then $500 for a TV)
Yeah, that is why you need to buy much larger diagonal for 16:9 than for 4:3 to get the same height.
16:10 is nice compromise, but it costs too much for my taste.
I wish 16:10 monitors were still a thing. that was a good compromise.
Close enough, mine was a sony tv I got from a pawn shop back in 2012, 200 or 300 bucks, I forget. I don't really care about the HDR shit. If my tv wasn't shitting itself I would be waiting it out just like you, but instead of HDR formats i'm waiting for the 120 hz tvs to go down in price. And not that bullshit 120 hz, true 120 hz.
4k is already entering diminishing returns, and anything above it only gets those harder. Image sharpness gain basically drops off instantly as you're already getting almost non-existent grain at 4k.
The other issue is that most 4k+ proponents are also pushing for ultrawide which is frankly dogshit that either lacks vertical FOV or has too much horizontal FOV, meaning you have to move your head and eyes around much more to see items in your peripheral vision which can break concentration and is frankly just inconvenient. As well for fitting more shit - multiple screens are a better solution for the non-existent benefit of "it gives more space" that some proponents bring up.
you are correct in waiting. its pointless to buy any display, console or GPU without full support for the HDMI 2.1 spec. variable refresh rate, better vsync modes and built in low latency as a standard are what we've needed for over 20 years.
4k60hz+ content makes sense for games. movies are mostly fine with 1080p. 8k60hz+vrr also makes sense for game content, but mostly for retro screen emulation and large format displays. most (if not all) 8k screens right now are a waste of time as they are 30hz only and use 4:2:2 (or worse) chroma subsampling, making red colors fucked up and muddy. until 8k displays with full support for the HDMI 2.1 spec (and full 4:4:4 processing) come out, its just an overpriced gimmick that will look worse with most content than native 4k or 1k displays.
I genuinely like that interface
You missed the "if my tv wasn't shitting itself" part. My tv is broken, im not going to wait. I'm going to buy one of these:
I know it's a cheap ass tv, but my old tv was a cheap ass tv and it did the job for me, and since it's got a better input lag than my old one I would consider it an improvement.
The televisions often also have worse secondary properties such as contrast and response time, making them incapable of displaying a more detailed image than lower resolution screens, while requiring more hardware to drive them in case of a computer.
It's mostly a meme to sell screens to normalfags who don't know anything. Just like those 10k+ screens that can't even compare to late sony CRTs bought used.
Right, the TCL TVs are a fine low-cost TV/monitor solution to buy right now, perfect for gaming, and they have great upscaling tech for such a price point. If I was in the market for a TV I would buy a TCL (until better technology was available for a good price).
Ironically it's the rebadged chinkshit Smart-TVs that lack these kind of "features" cause they're barebones and not really expected to receive software updates.
Just look for one using a semi-decent panel.
The 4 series TCL is dogshit for any sort of picture quality. The 5 series is almost as bad but the chunk fucks made it so thin it literally will burn itself out from overheating. The 6 series has so far been the only actually decent value proposition tv from that company with a picture and build quality that doesn't eat absolute ass.
What >>16627607 said. The absolute worst tvs on the shelves are the ones without any form of smart feature. You're about as likely to find one with a good panel as EA is likely to drop microtransactions. I'd actually recommend a monitor instead if you truly cannot have even a single smart feature with your panel.
Got a 1440p 144hz screen and it works out very well for me. It's perfect for video editing. I honestly don't see how higher res would hurt, that just means you can fit more onto your screen.
>ITT: people who think being a luddite and a malcontent is fashionable
What do you guys have against better quality existing? Why is making shit better a "meme?" Were cars a meme when they replaced horses? Do you all refuse to upgrade from VHS because DVD is a meme? And then there are the ones who swear that the best entertainment experience is using a 4:3 CRT television from 1985 with a coaxial cable. Obsolete technologies become obsolete for a reason. These opinions look like they're based on a belief that literally everything that happens now is horrible and you want to freeze time.
It's one thing to be miserable and angry because of unfair life experiences, but what I gather from you lot these days is that you're miserable and angry for the sake of misery and anger. The world is always changing, but it's not always for the worse. Do yourselves a favor and go outside for an hour or two, or at least get off this site and turn off the news. Call your parents and siblings. Go to the park. Get a burger or a bowl of noodles. Stop fixating on girls who date black guys or what you think the Jews are up to, those have literally always existed. There are too many people in the world for any one person to affect your life that much. Do something that makes you feel good instead of bad. There's actually a lot of good in the world, and you'll find it if you look for it. When you spend all your time in an echo chamber full of people who hate everything and everyone, it's easy to start thinking the world is irredeemable.
4k and 8k would be cool for the home movie theatre audience. like a 200" or 400" tv in 4k or 8k instead of a projector. even 1080p at that size would be a huge improvement.
what do you guys think of gsync? it looks better than freesync but theyre expensive.
i saw a 1440p 140hz monitor with gsync for 400. ones without are like 300.
Dell still makes good 16:10 monitors.
Read the thread. No one is angry simply because this stuff exists. What pisses people off is that it gets hyped and then chased after, meaning we end up with shit games with fantastic graphics because all the development resources went into making things look good for these new and incredibly demanding specifications.
It is a meme because of its many requirements. For gaming: a very expensive gpu. For storage: expensive HDDs with a lot of space. If streaming: a good ISP without a data cap and high quality streams, which are rare, plus good hidpi support.
(((They))) want to push 4K because it drives costs up, so you can milk dumb consumers more, everyone wins, ISP charges more, you need to buy more HDDs, or subscribe to expensive streaming option, blabla.
diminishing returns, faggot.
youre treating every sales release as revolutionary like niggers do. this could be another betamax, another webtv, or another 3d tv. were those better? were the people that didnt flush their money away like niggers resisting progress?
4k isnt noticeable on displays under 52", and even at 52" you can only notice it from within 2 feet of the tv. 4k on a 70" tv would be worth it for big movie buffs, but theyre selling 4k gaming which requires a huge performance drop in exchange for minimal improvements, and virtually no improvements on small monitor sized displays.
the "next big thing" isnt this just because it was marketed that way you stupid slave.
16:10 is like 1920x1200, right? i have an hp monitor at that size. shame it maxes at 60hz.
I'm already seeing commercials for 8K tvs. Progress is inevitable.
Never, despite being superior.
>oy vey goyim number is bigger that means progress ha ha big number is good wow lol
Shut up and get out. You’re not intelligent enough to be here.
This is a paid shill spamming across multiple boards at once.
im still using a 1920x1080. the only reason im thinking of switching is because an insect managed to crawl inside the screen film and died right in the middle. Now there's a giant brown/black pixel infesting my screen. also its nasty.
Is $100 more really worth it though?
Technological progress, yes. "Progress" that kikes use, no. It's not that it should happen, but they'll continue to phase out smaller resolutions. Like how no one produces CRTs anymore. Eventually no one will produce any tvs/monitors smaller than 4K. Some anons already have 16K monitors.
Every TV sold in a store right now is 4k. A $500 49 inch TV is 4k. Its not betamax.
Yeah I have an excessively expensive LG TV. I watch anime on it. Not my smartest budget decision.
My eyes don't see the difference between 1080 and 4K so I'm not in a rush to even go for 4K.
>t. navy officer
I make more money in a month than you and your entire family make in a year and if I cross you in the street I'm killing all of you and your dogs.
t. mafia boss
>betamax wasnt sold in every major electronics store.
you have to be at least 18 to post here, anon.
not that that's the point, we're talking about 4k/8k gaming. the fact that the TV itself has a higher resolution is meaningless and just another dick measuring contest for companies to have with each other. if youre thinking that because "everyone owns one and theyre cheap" hardware will move to take advantage of it, you should understand that around when you were born in 2004 i had a cheap standard CRT monitor that could go as high as 180hz and many others went even higher. do you think standard devices at the time were able to utilize 180hz? even now thats not the standard or even available to people with top of the line hardware most of the time.
Don't forget to take your experimental new "Sea Sickness" Tablets
What's the point if modern hardware can't even serve a fraction of the required fillrate?
oh boy, is that anything like marine "man meals"? or is it the complete opposite because its the navy?
Vitamin supplement buttplugs meant for long deployments
you can't just say "vitamin supplement buttplugs" without any kind of elaboration, anon. i won't allow it.
I've never served. I wanted to when I was in middle school, but became too disillusioned with the war by the time I could join.
High refresh rate was a must for CRT because of its inherent flicker deficiency.
i also wanted to serve israel's agenda in a foreign nation, putting myself at great risk of personal injury or death, when i was younger. now i just hope israel gets nuked. funny how that works, eh?
the flickering is barely noticeable at 60hz. at 75hz it's virtually gone. yet monitors well over 120hz were commonplace.
>his tv doesn't refresh in nanoseconds
Grandpa, 30-year-olds are too young to have used Betamax.
what difference does it make? the connection between your eyeballs and your brain is well over 100ms and varies. if you're going to get a monitor like that you should first get an optic nerve augment, or at least the drug/neuroptics solution.
well fug, i still think you're retarded if you think betamax wasn't at major electronics stores. it was no different than HD DVD vs Blu-ray, and every store had HD DVD when it first dropped. in fact it was more prevalent than Blu-ray at the start.
Son, i'm 34 and i used a Betamax recorder to capture footage of my SNES and N64 games until i was like 15.
VHS was an option, but of course my dad was a nerd and fell for the Beta meme, so that's what we had in the house. he had tons of shitty movies, so i normally grabbed those and recorded over them.
>that feeling when you open the beta max door
>the connection between your eyeballs and your brain is well over 100ms and varies.
Nope, what you're describing as "latency" is actually a neurological grouping effect in which we cluster a certain period of time. If something happens within 100ms or less, we tend to perceive it as "instant" because we regard that 100ms as a single instance of time. That doesn't mean we have a latency effect; it means our brain's processing clusters it.
The closest it gets, context given, is that our brains cap off at a "refresh rate" of around once per 13ms, or 77hz. A response time of 30ms (or more) would be horrible and very, very easy to perceive since the entire 100ms "period of time" would be processed as being visually laggy. This is why 60fps is generally considered good (close to the effective cap for information) but some people swear by 120fps (covers the small amount of remaining perception).
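Taking the post's ~13 ms "refresh" figure at face value (it's the poster's claim, not an established constant), the frame intervals at common frame rates line up like this, which is a quick sketch of why 60fps reads as "close" and 120fps as covering the rest:

```python
# Frame interval at common frame rates vs. the ~13 ms perceptual window
# described above. The 13 ms figure is taken from the post, not a hard fact.

PERCEPTION_WINDOW_MS = 13.0  # poster's claimed ~77hz perceptual cap

def frame_interval_ms(fps: float) -> float:
    """Time between frames in milliseconds at a given frame rate."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    interval = frame_interval_ms(fps)
    verdict = "under" if interval < PERCEPTION_WINDOW_MS else "over"
    print(f"{fps} fps -> {interval:.1f} ms per frame ({verdict} the 13 ms window)")
# 30 fps -> 33.3 ms per frame (over the 13 ms window)
# 60 fps -> 16.7 ms per frame (over the 13 ms window)
# 120 fps -> 8.3 ms per frame (under the 13 ms window)
```

So under that assumption, 60fps (16.7 ms/frame) still sits just over the window, while 120fps (8.3 ms/frame) drops below it.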
This 30" monitor is 16:10 and 2k. it's $300 and i've never heard of the company. every other monitor i can find with that aspect ratio and above 1080p is at least 3x the price.
notice the lack of an advertised refresh rate.