1080p 10-bit vs 1080p. Most UHD content is 10-bit.
I ordered the 1440p 165 Hz version but got the 1080p 300 Hz instead.
You are comparing two very different formats.
IPS panels, 1440p 144 Hz » LG 27GN800-B - IPS, a favourite mid-range monitor. List last updated July 2023.
I think some of the streaming companies are creating these Frankenstein versions (with their original content), and I know some individuals/groups are re-encoding (with improper settings) and then releasing them.
More detailed, sharper image.
1080p is just an HD resolution; 10-bit is colour depth (bits per colour). Two different metrics.
The only other reason for additional cropping would be if e-stabilization was on, but that would apply to both the 1080p and 4K 10-bit options (plus 4K 8-bit 60p, because it's a processor-hungry setting).
If you have a 6700 XT then get a 1440p monitor and enjoy it.
Streamers are currently doing Elden Ring in 1080p at 7-8K bitrate and it looks fine.
Old Tigole releases were the best, but since around when he joined QxR the new releases are all really bloated, almost x264 1080p size.
Usually, 1080p doesn't go higher than 8-bit.
If the bitrate is the same, the resolution being smaller won't make the file size smaller.
One is a 4K encode, one is a 1080p encode, and they both come in x264 and x265.
Most high-end widescreen monitors now are simply 24" 1080p TVs (minus the TV converters). The differences come into play when it's the choice between 1080i and 1080p on a TV, and of course the size of the TV.
936p tends to work decently well at 6K; depending on what's going on it can be too much, which is why I said they should try whether it works for them, but usually it does.
While generally I would say 1080p is good enough, YouTube has a different equation.
I'd say a good 4K WEBRip > 1080p Blu-ray though.
Surprisingly, 1440p isn't all that hard to push compared to 1080p; 4K on the other hand is a performance nightmare unless you are fine with 60 fps on solid hardware.
Personally I have not been able to tell any visible difference in 8- vs 10-bit colour, even running in HDR and verifying that my TV is actually running at 10-bit 4K 120 Hz HDR.
There's no difference when it comes to the actual quality (resolution) of the video; the original file is already 10-bit.
High-bitrate 1080p is obviously the best, but over a gigabyte for a single episode is a bit too much.
This is also why YouTubers often upload WAY higher quality even if they're limiting to 1080p60: the compression makes their final video quality higher.
I have a 4K TV.
I have several files which are 10-bit HEVC 1080p webrips that I would like to archive on Blu-ray. What is the best software, and what quality settings should I use, to make Blu-ray-compliant H.264 files? Is there a decent way to do it with ffmpeg? (A starting-point ffmpeg sketch follows below.)
If I put both of these in the same folder, will Plex automatically pick the best one to play on the client? In my experience, it seemed to pick the HDR version every time and almost always end up transcoding it.
Set the HandBrake encoder to Constant Quality, set that number to 21, and let her rip.
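On the ffmpeg question above, here is a minimal sketch (not a verified Blu-ray authoring workflow) of how a 10-bit HEVC webrip could be re-encoded to 8-bit H.264 High@4.1 as input for a BD authoring tool. The filenames and the CRF value are placeholders, and real Blu-ray compliance adds constraints (GOP structure, VBV limits, container format) that the authoring software normally enforces.

```python
import subprocess

def hevc10_to_bd_friendly_h264(src: str, dst: str) -> None:
    """Re-encode a 10-bit HEVC file to 8-bit 4:2:0 H.264 High@4.1.
    Hypothetical filenames; tune CRF/preset to taste."""
    cmd = [
        "ffmpeg", "-i", src,
        "-map", "0:v:0", "-map", "0:a:0",        # first video + first audio stream
        "-c:v", "libx264",
        "-profile:v", "high", "-level:v", "4.1",  # Blu-ray players expect High@4.1 or lower
        "-pix_fmt", "yuv420p",                    # 10-bit -> 8-bit 4:2:0
        "-preset", "slow", "-crf", "18",
        "-c:a", "copy",                           # keep the original audio untouched
        dst,
    ]
    subprocess.run(cmd, check=True)

hevc10_to_bd_friendly_h264("episode.1080p.x265.10bit.mkv", "episode.bd.x264.mkv")
```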
With the 1080p Blu-ray encode, I ended up with a 5.5 GB file and an average bitrate of 6972 kbps.
It's widely supported in hardware and gives excellent compression.
1080p: video resolution (size of the video). x265 HEVC: video compression format (how the video data is stored). AAC: audio compression format (how the sound is stored).
This remained the case up until 4:2:2 10-bit 4K content.
If I take a 4K disc and convert it to 1080p while compressing, would I see any difference from taking a 1080p disc and compressing it? Using h265 in 10-bit.
1080p is just the resolution; the actual quality relies on bitrate. They probably mean that 1080p Premium will have the highest bitrate available, while normal 1080p will have an average bitrate for 1080p, or a bitrate calculated from your network speed.
P.S. The file settings are the same except for the 10-bit setting.
5-10 years ago, they still used to make some good 1080p TVs.
Then go back to the new and original files, drop them into MKVToolNix, and make an MKV using the 10-bit x265 file I just made plus the lossless audio.
The REMUX will have a much higher bits/pixel ratio and, all things being equal, would look better.
It's worth mentioning that there is lots of content out there encoded in 10-bit H265 that is SDR (REC.709 colour space), not HDR (BT.2020 colour space).
This left-shifts the pixel values by 2.
1080p video uploaded to YouTube is transcoded and compressed using the avc1 codec with pretty low bitrates.
If you're using HEVC for 4K vs H264 for HD, then yes, it can still be better with a lower bitrate, because HEVC is roughly (not precise, but close enough) double the efficiency of H264, so a 1080p 20 Mbps HEVC file is in theory equal in image quality to a 40 Mbps 1080p H264 file. (A quick calculator for this rule of thumb follows below.)
Yeah, exactly like you mentioned with 4K, 1080p is better on Blu-ray as well.
This is very wrong. Generally (for 1080p WEB-DLs) the order goes like this: AMZN > ATVP > MA > DSNP > NF > PMTP > HMAX > HTSR > iT. Note: DSNP is best for animated content.
While that is 4 times the bitrate, is the image any sharper? Much of the bitrate will be consumed by the higher colour depth.
Will dropping to 10-bit make a difference on a 1080p(/i) Blu-ray? My current setup is too old to trust a comparison, and I want to futureproof for a new TV.
x264 vs x265 at 1080p.
The source files are already 8-bit, so I don't see why there would be anything gained with 10-bit.
HDMI 2.1 / 10-bit HDR NAS: thoughts on 1080p vs 1440p.
Hello, I'm having a problem: I want to encode some Blu-rays as well, but I'm stuck between two qualities, 1080p Blu-ray x265 HEVC AAC 8-bit and 1080p Blu-ray x265 HEVC AAC 10-bit.
While on the right side you can't read a single letter from that pack of macaroni, on the 5.3K image you can read that per 100 grams it has about 71 grams of carbohydrates. Relevant xkcd.
So have a quick look at the colours on the 1080p stream vs the 4K stream and see if the 1080p looks better. Kind of a garbage in/garbage out scenario.
So definitely use that if you aren't already.
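To make the "HEVC is roughly twice as efficient as H.264" rule of thumb above concrete, here is a small sketch that converts a known-good H.264 bitrate into an approximate HEVC target and reports bits per pixel per frame. The 50% factor is the rough community heuristic quoted above, not a measured constant.

```python
def bits_per_pixel(bitrate_kbps: float, width: int, height: int, fps: float) -> float:
    """Bits spent per pixel per frame - a rough way to compare encodes
    of different resolutions and frame rates."""
    return (bitrate_kbps * 1000) / (width * height * fps)

def hevc_equivalent_bitrate(h264_kbps: float, efficiency: float = 0.5) -> float:
    """Approximate HEVC bitrate matching a given H.264 bitrate, assuming
    the ~2x efficiency rule of thumb (the 0.5 factor is an assumption)."""
    return h264_kbps * efficiency

# Example: a 40,000 kbps 1080p24 H.264 stream vs its ~20,000 kbps HEVC counterpart.
h264 = 40000
hevc = hevc_equivalent_bitrate(h264)
print(f"HEVC target ~{hevc:.0f} kbps")
print(f"H.264 bpp: {bits_per_pixel(h264, 1920, 1080, 24):.3f}")
print(f"HEVC  bpp: {bits_per_pixel(hevc, 1920, 1080, 24):.3f}")
```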
I think my best bet is keeping the TV shows I want to keep and rewatch (GoT, Band of Brothers, etc.) in 1080p Blu-ray x264, then the others I'm not too fussed about in 1080p x265 rips.
It's an entirely different thing if you simply don't have the money for the cheapest 4K TVs, but a 5-year-old 1080p TV from some used electronics store or online store will easily outclass any shoddy new 1080p TV you would be able to purchase now at the same price.
Which is fine, because 10-bit colour/HDR and higher bitrate.
I generally want the quality closest to source.
No reason to buy a nice monitor if your hardware can't push it.
Generally any 1080p movie above 5 GB will be fine.
I booted up Tomb Raider 2013 and saw how crappy it made 1080p look; I instantly got over 120 Hz. I got a Zephyrus G15, and its 1440p 165 Hz screen is a revelation.
4:2:2 10-bit using the S&Q mode, but I want to slow it down in post and also keep the audio. Is this even possible?
Convert the 8-bit video to 10-bit video.
It is almost indistinguishable from the source, but a tad softer and possibly introducing additional artifacts.
Also, all these comments saying "gaming renders at 720p" are beside the point.
My 1440p monitor can support the 10-bit colour depth, but at 120 Hz max.
But what about in reality? Does it make a noticeable difference? It's just more common to see HDR with 10-bit.
Both discs were encoded to HEVC at CQ RF 18, with the encoder preset at Medium, and audio always as passthrough.
It makes a HUGE difference in quality for me, with nature videos that have constant movement and a narrow field of view with high complexity.
Contrast is OK.
Which one is better, 10-bit 1080p or 60 fps 1080p?
They're both next-gen codecs.
I often see that 1080p rips use the standard 1080p Blu-ray as the source. Would it be better to use the 4K UHD release of the same movie and downscale it with HandBrake to 1080p? (A downscale sketch follows below.)
That means you can have a blurry 1080p video with a lower bitrate.
Honestly, audio is the biggest deciding factor for me.
Both give you larger file sizes and need more processing power than 8-bit HD, which can generally be edited on a phone.
Now, 4K-sourced 1080p WEBRips are even better than WEB-DLs if the encode is good (from NTb, AJP69).
HDR didn't come around until 4K releases using 10-bit colour, but not all 4K discs are 10-bit/HDR.
I tested with a number of movies, old and new, and the results are the same.
It sucks that we still don't have anything over 60 Hz in 4K, so right now you have to choose resolution or refresh rate.
x264: WEB-DL is the source (not recorded, but an original copy of what is stored at Disney+); 1080p is the resolution of the video stream; AAC + x264, same as before.
H265 10-bit for 1080p(/i) Blu-rays.
Definitely 4K WEB-DL.
Small channels get the AVC codec at 1080p, which doesn't need that much processing power when YouTube re-encodes your uploads.
The difference between 1440p and 1080p is huge, and with 1080p you're mostly stuck at 24 inches because anything bigger becomes pixelated; I can see the pixels clearly in some games (old 1080p 24-inch monitor). Whereas with 1440p you can get a 27-inch monitor and its PPI is still better compared to a 1080p 24-inch.
10-bit HDR vs REMUX. So your display isn't relevant in this comparison.
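For the "downscale the 4K UHD release to 1080p" idea above, this is one possible sketch of a HandBrake-style constant-quality encode done with ffmpeg instead. The input name, CRF 18, and preset are illustrative, and if the source is HDR you would additionally need tone-mapping or HDR metadata handling, which is not shown here.

```python
import subprocess

def downscale_uhd_to_1080p(src: str, dst: str, crf: int = 18) -> None:
    """Downscale a 2160p source to 1080p, 10-bit x265, constant quality.
    Assumes an SDR source; HDR sources need extra tone-mapping/metadata steps."""
    cmd = [
        "ffmpeg", "-i", src,
        "-vf", "scale=1920:-2",            # 1080p width, height kept divisible by 2
        "-c:v", "libx265",
        "-pix_fmt", "yuv420p10le",         # keep a 10-bit pixel format
        "-preset", "medium", "-crf", str(crf),
        "-c:a", "copy",                    # audio passthrough, as in the post above
        dst,
    ]
    subprocess.run(cmd, check=True)

downscale_uhd_to_1080p("movie.2160p.mkv", "movie.1080p.x265.10bit.mkv")
```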
i run 1440p 165hz 27" but i also have a 6900xt to push it and a 3080 12gb to use for certain games and when i stream. But apples to apples, it would then depend on how you watch the content. 4K video gets transcoded using vp9 or av1 which are much better codecs So to make a proper comparasion I took your 5. With 1080p SDR, you did not need 10 bit color depth. Officially the HD does 1080p H. 7gb Truffle 1080p size is 5gb approx TGx 720p and 1080p is 300mb to 1gb Psa 720p and 1080p is 500mb to 900mb MeGusta 1080p is 900mb but i don't give a subtitles sometimes. Without hardware support it's not going to see much adoption. 1440p vs 1080p for fps games so im wondering which is better i mostly play valorant and cod , 1440p 165 hz or 1080p 240hz is it really worth upgrade to any of this for 1080p i see ViewSonic XG2431 24 Inch 1080p 240Hz as option 1440p idk what is better any suggest will be appreciated 1. 264 8-bit 1080p 60fps VP9 10-bit HDR10 1440p 30fps VP9 8-bit 1440p 60fps HDR AV1 10-bit (Not GPU accelerated decoding but still completely smooth) 1440p 60fps VP9 8-bit 4K 60fps VP9 10-bit HDR10 Where it stuttered and could not play smoothly: 4K 60fps AV1 10-bit HDR10 No other player could, of course, not VLC either. Also I will be shooting at 29. A reddit dedicated to the profession of Computer System Administration. if you are running a medium settings 1080p rig keep a 1080p monitor. Basically some of the extra size will have to go to "sharpness" information, rather than "color" information. 8-Bit 144Hz vs. 84 GB (2 330 kb/s, x264 8 bit) vs 3. If you encode from your 4k HDR10 source to 1080p you will have a smaller size than your 2GB file. Infuse can direct play 1080p H. P. Playback performance at 10 bit is probably higher due to ALL-I encoding vs the LongGOP encoding. 265 vs. I know that HDR could be an issue, but would the quality be better or do I just waste cpu time and should take the 1080p blu ray rip. I went from 1080p 60hz to 1440p 144/165/hz with my 7800 XT and got a huge boost in FPS and game performance. but I feel the same way about 60fps vs 140+. Life action vs cartoon. 1. Ok Imma try my best I'm looking at aset of a dvd show remastered and there are two options to pick from x264 and x265 10bit. Once upon a time, x264 was the de facto standard for 1080p. Reply reply More replies More replies - 10bit vs 8bit, 10bit encodes are smaller - Not the same source, not the same resolution and not the same codec (h264 vs h265), so it's difficult to compare, it's like 2 different movies. 1 7. That said I'm incredibly jealous of your 10 bit display. If bit rates are ok for a 4k video web-dl is the way to go. I've always used 8 bit encoding in x264, because 10 bit was barely supported, but it seems 10 bit on x265 is much more supported. Not sure which movie this is, but I can't imagine why not to get the 4k copy. Thanks in advance! so done a bit of digging and found plenty of 1080p x265 with 24 frame rate. Reddit . I have just ordered a 10 bit 8k camera as I am greedy and want both. 10 bit 4:2:2 1080@60 @ ~ 200mbps mp4 is very similar in size to 8 bit 4:2:0 4k@30 @ ~ 100mbps mp4. problem is these rips dont detail the source. Busy content or not. This is because there's more overall information in the 1080p image. I got that part. BluRay. If that’s part of your work flow or something you plan to do 10 bit will make a big difference over 8 bit. View community ranking In the Top 1% of largest communities on Reddit. 66gb On my Sony A7 III I shoot the majority of my footage in 1080p 100fs. 
Some Linux ISOs will be titled "x265 HEVC 10-bit", but the word HDR is missing. Is that HDR, with the word implied because HDR is 10-bit?
1080p can be upscaled to 4K by turning each 1x1 pixel into 2x2 pixels, because 1080p is exactly 1/4 the size and the same proportions as 4K.
24in 1080p 144 Hz » LG 24GN650-B - IPS, fast response time, 144 Hz, HDR10.
Also, to my eyes there's BARELY any difference between a 500 MB 1080p video and a 1.4 GB 1080p video.
Why are you in a position to choose either?
Now, you can use a higher bit depth for video encoding, and x264 currently allows up to 10 bits per channel (1024 levels and 30 bits per pixel), and of course that allows for much higher precision.
And X video streams at 1080p at 1100 bitrate.
Curious why Sonarr is detecting some videos as being 720p when they should be 1080p. (A quick ffprobe check follows below.)
If you encode from your 4K HDR10 source to 1080p you will have a smaller size than your 2 GB file.
Infuse can direct play 1080p H.265 from a Plex server on the HD; the device is at its limits though, and you can tell it's working hard to play the video.
P.S. Playback performance at 10-bit is probably higher due to ALL-I encoding vs the Long-GOP encoding.
I know that HDR could be an issue, but would the quality be better, or do I just waste CPU time and should take the 1080p Blu-ray rip?
I went from 1080p 60 Hz to 1440p 144/165 Hz with my 7800 XT and got a huge boost in FPS and game performance.
But I feel the same way about 60 fps vs 140+.
Live action vs cartoon.
OK, I'mma try my best: I'm looking at a set of a remastered DVD show and there are two options to pick from, x264 and x265 10-bit.
10-bit vs 8-bit: 10-bit encodes are smaller. Not the same source, not the same resolution, and not the same codec (h264 vs h265), so it's difficult to compare; it's like two different movies.
That said, I'm incredibly jealous of your 10-bit display.
I've always used 8-bit encoding in x264, because 10-bit was barely supported, but it seems 10-bit on x265 is much more supported.
Not sure which movie this is, but I can't imagine why not to get the 4K copy.
Thanks in advance! So I've done a bit of digging and found plenty of 1080p x265 at 24 fps; problem is, these rips don't detail the source.
I have just ordered a 10-bit 8K camera, as I am greedy and want both.
10-bit 4:2:2 1080p60 at ~200 Mbps MP4 is very similar in size to 8-bit 4:2:0 4K30 at ~100 Mbps MP4.
Busy content or not.
I got that part.
Tigole makes great encodes and the special featurettes are great, but the x265 versions are quality-wise worse than their original counterparts.
144 Hz 1080p vs 1440p? Just picked up an Acer XV272U for $230, refurbished directly from Acer.
The Back to the Future series went from 10 gigs to 5 gigs.
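For the Sonarr question above (videos detected as 720p that should be 1080p), one way to sanity-check what the file itself reports is to read the first video stream with ffprobe. This is a generic sketch, not Sonarr's own detection logic, and the filename is a placeholder.

```python
import json
import subprocess

def video_stream_info(path: str) -> dict:
    """Return width/height/pixel format of the first video stream via ffprobe."""
    out = subprocess.run(
        [
            "ffprobe", "-v", "error",
            "-select_streams", "v:0",
            "-show_entries", "stream=width,height,pix_fmt,codec_name",
            "-of", "json", path,
        ],
        capture_output=True, text=True, check=True,
    )
    return json.loads(out.stdout)["streams"][0]

info = video_stream_info("episode.mkv")
print(info)  # e.g. {'codec_name': 'hevc', 'width': 1920, 'height': 1080, 'pix_fmt': 'yuv420p10le'}
```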
In August 2016, Netflix published a comparison of x264, VP9, and x265 using video clips from 500 movies and TV shows and 6 different quality metrics, and found that both VP9 and x265 outperform x264.
In theory, 10-bit gives each pixel the ability to pick from over 1 billion shades of colour, compared to 16 million shades offered by 8-bit.
I'm currently planning to shoot some footage that covers day and night lighting.
In my case, Y video streams at 7000 bitrate: very good video quality.
Stadia 1080p vs PC: you can have X video streamed at 1080p resolution with the bitrate at 500.
With a 1080p REMUX at 25.5 Mb/s and a 4K WEB-DL at 14.5 Mb/s, I don't care about the sound quality because my setup can't really take advantage of DTS-HD or Atmos.
Welcome! For PCs and general gaming, here are the most frequently recommended monitors, taken from Reddit posts.
Based on a quick check, HandBrake doesn't appear to have an Apple device preset for H.265.
HEVC will pass through 10-bit colour options just fine, but you have to specify the HDR metadata in the encoder parameters, or else HDR signalling will not work on most devices and it'll play back like an SDR file with washed-out colours. (An example of those parameters follows below.)
A 4K h.265 file needs to be roughly double the bitrate of the 1080p h.264 file.
10-bit seems to be the minimum at the moment.
Even my kids' $100 tablet can direct play 10-bit x265. Maybe you can save 2-4 GB on a 1080p.
RM4K [1080p Bluray x265 10bit].
HandBrake settings for anime: 1080p 10-bit x265.
As far as HDR goes, most 1080p 10-bit HEVC content is re-encoded from SDR content; the 10-bit colour space is just used as a trick to improve the compression ratio slightly.
If the bitrates are OK, for a 4K video a WEB-DL is the way to go.
Good 1080p and good 4K are hard to tell apart on a laptop-sized screen, but YouTube treats 1080p and 4K video very differently.
There are even instances where a 1080p Blu-ray will look better than a 4K stream.
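On "you have to specify the HDR metadata in the encoder parameters": with ffmpeg and libx265 that typically means passing colour primaries, transfer, matrix and (if you have them) mastering-display/content-light values through -x265-params. The sketch below shows the general shape; the master-display and max-cll numbers are placeholders that should come from the actual source, and the filenames are hypothetical.

```python
import subprocess

# Illustrative HDR10 signalling for a 10-bit HEVC encode. The master-display
# and max-cll values below are placeholders, not universal constants.
x265_params = ":".join([
    "hdr10=1",
    "colorprim=bt2020",
    "transfer=smpte2084",        # PQ transfer function
    "colormatrix=bt2020nc",
    "master-display=G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,1)",
    "max-cll=1000,400",
])

cmd = [
    "ffmpeg", "-i", "movie.2160p.hdr.mkv",
    "-c:v", "libx265", "-pix_fmt", "yuv420p10le",
    "-crf", "18", "-preset", "medium",
    "-x265-params", x265_params,
    "-c:a", "copy",
    "movie.hevc.hdr10.mkv",
]
subprocess.run(cmd, check=True)
```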
I have an AMD 5900X processor and an RTX 3080 graphics card and get an average of 200 fps at 1080p.
Therefore I wonder, is there any upside to allowing Bluray-1080p encodes? The only one evident to me is HEVC.
(1) How far will your head be from the screen during normal play? I would say skip the 1080p if you sit ~7 or more feet from the TV.
TCL Class 3 Series looks OK, but I noticed it comes in 720p and 1080p versions.
Where it played smoothly: 1080p 60fps H.264 8-bit; 1080p 60fps VP9 10-bit HDR10; 1440p 30fps VP9 8-bit; 1440p 60fps HDR AV1 10-bit (no GPU-accelerated decoding, but still completely smooth); 1440p 60fps VP9 8-bit; 4K 60fps VP9 10-bit HDR10. Where it stuttered and could not play smoothly: 4K 60fps AV1 10-bit HDR10. No other player could, of course; not VLC either.
If you are running a medium-settings 1080p rig, keep a 1080p monitor.
Basically some of the extra size will have to go to "sharpness" information rather than "colour" information.
8-bit 144 Hz vs 10-bit 120 Hz.
The XF243Y overclocks to 165 Hz.
8-bit vs 10-bit encoding confusion (again): a test comparing Settings, PSNR, VMAF (1080p), size and speed for 8-bit and 10-bit encodes.
8-bit vs 10-bit and SDR vs HDR have an impact on the amount of colour displayed, so the more the better; also, the more colour, the bigger the file will be.
Tadda!!!!! I figured it out. I now have my signal running at 1920x1080, 119 Hz, 12-bit 4:2:2 colour space.
Film grain vs digital smooth.
HDR10 is 10-bit, Dolby Vision is 12-bit.
10-bit looks noticeably better than 8-bit; 12-bit not so much better than 10.
It all comes down to bitrate.
5.1: surround sound type (5 speakers plus a subwoofer). Joy: likely the person who originally ripped the video for illegal distribution ;P
Well, someone's been illegally downloading movies, I see. They're the file's metrics.
Real-world everyday scenario, not on a film set.
A7IV 1080p 120fps 8-bit vs 24/60fps 10-bit differences - help needed please!
"A minute of 10-bit HDR ProRes is approximately 1.7 GB for 30 fps at 1080p and 6 GB for 30 fps at 4K."
Settings -> Camera -> Formats. ProRes uses a bonkers amount of data.
I'm stuck with the 128 GB 13 Pro Max and have been wondering if I should shoot at 4K with the 10-bit extreme setting or go for that sweet bitrate with ProRes HQ 1080p (can't do 4K ProRes on my model).
Nerds on Reddit don't like to admit this, but for most casual recordings of friends and family, 1080p @ 30 fps is usually good enough.
A 1080p vs 4K HDR file doesn't always look super different in modern movies, especially when the 1080p has a good bitrate.
But the size difference is a lot: the 8-bit is about 113 GB but the 10-bit is about 160 GB.
Infinity War: 1080p vs 4K 10-bit vs 4K Atmos.
I am only doing 10-bit for 4K HDR source material, not regular 8-bit Blu-rays.
Will video quality be better if I put the TV in SDR mode or HDR mode? Let's say I have a few files to choose from.
4K vs 1080p is a factor only you can decide. It's based on screen size, sitting distance, and your eyesight.
H264 10-bit vs H265 10-bit (4K HDR): HEVC files always transcoded to x264 1080p.
If your sources are that grainy, you're still going to get bigger file sizes, but still more palatable than x264 8-bit file sizes would be, IMHO.
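Putting numbers on the 8-bit/10-bit/12-bit comparisons above: levels per channel double with every extra bit, and the total colour count is that value cubed for three channels.

```python
def levels(bits: int) -> int:
    """Levels per colour channel at a given bit depth."""
    return 2 ** bits

def total_colors(bits: int) -> int:
    """Total representable colours across three channels."""
    return levels(bits) ** 3

for b in (8, 10, 12):
    print(f"{b}-bit: {levels(b):>5} levels/channel, {total_colors(b):,} colours")
# 8-bit  ->   256 levels, ~16.8 million colours
# 10-bit ->  1024 levels, ~1.07 billion colours
# 12-bit ->  4096 levels, ~68.7 billion colours
```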
1.84 GB (2,330 kb/s, x264 8-bit) vs 3.79 GB (4,998 kb/s, x265 10-bit).
With that said, having fewer pixels needing to be filled by the bitrate means that it will arguably be able to look better despite the lower resolution.
Should I use my bandwidth on 720p at 2.5 GB or 1080p at 5 GB?
In an effort to save space I've been looking to transition some of my library to x265, and in doing so I've noticed there are quite a lot of 1080p x265 releases that are tagged as "10bit."
What are the best quality settings to use to make Blu-ray-compliant h264 files? RF 16 with the film tune, High profile, level 4.1.
The Truman Show: 1080p Remux - MPEG-4 AVC video, 35825 kbps, 1080p / 23.976 fps / 16:9 / High Profile 4.1. The Truman Show: 2160p 4K WEB-DL DV - 3840 x 2076 - 23.976 fps.
For example: The Untouchables ('87), AVC, DTS-HD MA 6.1, 35 GB vs The Untouchables ('87), HEVC 10-bit, DTS-HD MA 6.1.
The Witcher (2019) Season 1 S01 (1080p NF WEB-DL x265 HEVC 10bit AAC 5.1 MONOLITH) [QxR], 15.3 GB vs The Witcher (2019) Season 1 S01 (1080p NF WEBRip x265 HEVC 10bit EAC3 5.1 Vyndros), 11.3 GB. I have heard Vyndros does a pretty good job; haven't heard of the other encoder.
My 10-bit HDR encodes to 1080p are half the size at 10-bit compared to 8-bit.
Due to bandwidth and storage limitations, all my content is 1080p at a bitrate of about 5-10 Mb/s, which is about the bitrate that most streaming services use for 1080p.
Higher bit depths tend to get a little benefit in encoding efficiency due to the higher-precision DC / low-frequency components.
Don't get why it's on the 4K one at the top though; it's more relevant to the bottom ones, to differentiate it from a previous version that was mastered in 2K or from a different cut.
If I try 1080p 4:2:2 10-bit it only gives me the option of 25 and 50 recording frame rates (I've seen people shoot 1080p 4:2:2 10-bit).
As a simplified example, if the 8-bit luma values were 1, 1, 1, and 0, then after step 1 the 10-bit values are 4, 4, 4, and 0, and after step 2 the new 10-bit value is 3.
If you are using software/hardware with a crappy scaler, 1080p will probably be technically better.
For just viewing a movie, you aren't going to be able to see a difference between a 10 GB 1080p encode and a 50-60 GB 1080p remux, even pausing and comparing side by side.
One thing to consider would be that 4K is usually encoded in a 10-bit colour range, meaning it can have 4x the range of 8-bit: 256 vs 1024 levels per channel.
3 Mbps, for example, is passable for 720p (though not great) but IMO is too low for 1080p, which shouldn't be under 8 Mbps. (A ballpark calculator for this follows below.)
TL;DR: Trying to cut file size when encoding TV episodes, but I want the file closest to source.
I know the new 4K Blu-rays to be released soon will completely replace the current 8-bit colour depth (16 million colours) with the new 10-bit (1 billion colours), but do the current slate of 1080p Blu-rays have 10-bit output? If it does, maybe this is the issue.
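A rough way to sanity-check the "X Mbps is fine for 720p but too low for 1080p" judgements above is to scale a bitrate you trust at one resolution by the pixel-count ratio of another. This ignores codec and content differences, so treat it purely as a ballpark.

```python
RESOLUTIONS = {"720p": (1280, 720), "1080p": (1920, 1080), "2160p": (3840, 2160)}

def scaled_bitrate(known_mbps: float, known_res: str, target_res: str) -> float:
    """Scale a bitrate by the ratio of pixel counts between two resolutions."""
    kw, kh = RESOLUTIONS[known_res]
    tw, th = RESOLUTIONS[target_res]
    return known_mbps * (tw * th) / (kw * kh)

# If ~3 Mbps is acceptable at 720p, the same bits-per-pixel at 1080p would need:
print(f"{scaled_bitrate(3.0, '720p', '1080p'):.1f} Mbps")  # ~6.8 Mbps
```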
10-bit SDR content should be displayed like normal on an 8-bit screen. You definitely want 10-bit over 8.
But: most graphics cards and display devices don't allow more than 24 bits per pixel.
The 10-bit OLED at QC at work is amazing. Too bad it only takes video signals.
Displays that are natively 8-bit can display HDR with dithering techniques.
1080p 280 Hz vs 1440p 170 Hz, at the same price.
24in 1080p 144 Hz » AOC 24G2 or Acer Nitro VG240Y or Acer Nitro XF243Y - similar price, with FreeSync. Best combo of performance and value. Best bang for the buck right now, unless the AOC 24G2 is in stock.
1440p 144 Hz.
As computer-macine said, don't use x264 10-bit.
In general, h.265 will deliver the same quality as h.264, but with a file size saving of 30-40%.
It's normally for 1080p content shot on film that was restored for 4K, or that has a 4K DI that was downscaled to 1080p, for a good picture quality.
I keep everything in 4K that comes in 4K.
However, I've been using MacBook Pros for 13 years now, and going from my 16" MBP to my 15R3 1080p Alienware was just sort of disappointing.
Hi Reddit, I read mixed discussions on the compression options of the R5.
Anime is often 10-bit in an SDR space, because animation tends to have large colour gradients which benefit from 10-bit.
With the 4K disc, I limited the resolution to 1080p and encoded to HEVC. With the 1080p disc, I encoded to HEVC, keeping the 1080p resolution.
Compression normally doesn't affect the colours to the extent you are describing.
Downscale the video from 4K to 1080p. This averages 4 10-bit pixel values into 1. (A numeric check of these steps follows below.)
Bitrate is the amount of data that is being presented to you on screen at a given moment.
Bitrate decides the quality of the 1080p video.
This means a 10-bit image can display up to 1.07 billion colours.
In addition, an 8-bit file works with RGB using 256 levels per channel, while 10-bit jumps up to 1,024 levels per channel.
FWIW, for most people I'd probably say just stick with 1080p. Not trying to be mean, just giving advice: native is the way.
"Standard" Cake 1080p size is 5.5 GB and Cake 720p size 2.7 GB; NTb 1080p size is 5 GB approx and NTb 720p size about 2 GB.
4K low bitrate vs 1080p high bitrate: days and days to attempt to DL the 10-bit HDR 2160p files, whereas the 10-bit HDR 1080p files fly off my computer. Of course I realize that there are more seeders for the smaller 1080p files, and frankly these files look better than decent on my OLED television, but I'm wondering what easily understood steps I can take to try and get a decent connection.
To conserve space, first and foremost, make sure you're using x265.
Once upon a time, x264 was the de facto standard for 1080p. Nowadays x265 can do just as good for 1080p-or-less content.
VVC vs AV1 is the paid, proprietary MPEG codec (VVC, H.266) vs Google open source (AV1). For comparison, VVC is about 10 times more computationally intensive than HEVC on encode and two times on decode.
Seems a lot of people decided to get a bit of both and went 1440p 144 Hz though.
4K vs 1080p at the same size: are they the same codec? If the 2160p file is x265 while the 1080p is x264, then you could very well get better quality from the 2160p file, but it's hard to say for sure.
(2) If you use the TV as a monitor for your computer, go for the 1080p, regardless of (1).
1080p remux vs 2160p x265 encode?
HDR vs SDR vs 10-bit vs 8-bit.
When the video starts, swipe down and change the playback settings to "convert automatically"; you should get smooth playback.
Turns out what I had to do was download Custom Resolution Utility and go into the HDMI metadata (or something like that); after I edited the TV's existing 1080p signal to 120 Hz, I went into sub-settings and enabled 30- and 36-bit colour, HDR sampling, and the 4:2:2 colour space for that signal.
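The 8-bit-to-10-bit-then-downscale recipe described above (left-shift each sample by 2, then downscale 4K to 1080p so each 2x2 block of 10-bit values averages into one) can be checked numerically. This sketch just reproduces the simplified luma example quoted earlier in the thread, not a full video pipeline.

```python
import numpy as np

# One 2x2 block of 8-bit luma samples from the example in the thread.
block_8bit = np.array([[1, 1], [1, 0]], dtype=np.uint16)

# Step 1: convert 8-bit to 10-bit by shifting left by 2 (multiply by 4).
block_10bit = block_8bit << 2          # -> [[4, 4], [4, 0]]

# Step 2: downscale 4K to 1080p, averaging each 2x2 block into one pixel.
downscaled = int(block_10bit.mean())   # -> 3 (integer average of 4, 4, 4, 0)

print(block_10bit.flatten().tolist(), downscaled)  # [4, 4, 4, 0] 3
```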
However, if you tried to play 10-bit HDR on an 8-bit screen, it would look dull/washed out.
Just like h264 vs vp8, and h265 (HEVC) vs vp9.
When your camera can shoot both 1080p and 4K at the same bitrate, how is the 4K video not much worse in quality than the 1080p video? It has to compress over 3 times the pixels while the file size stays the same.
For the same encode (encoding is the process the original file goes through to get smaller; it compresses video and audio, making the quality worse than the original source), 10-bit HDR > 10-bit SDR > 8-bit SDR.
In my experience, I'd say that with AV1 there are tangible all-around visual improvements all the way down to 128 kbps, compared to previous codecs, even for SD/720p content (just at a lesser rate than HD).
As to HD vs 4K on the fp, my insight is unfortunately "it's a little less sharp." I'd assume ALL-I HD is markedly worse relative to ALL-I 4K than HD RAW is to 4K RAW, simply because little to no compression is involved with RAW.
EG: a 32" 1080p TV will "look" sharper than a 52" TV, as the pixels are closer together.
Not many transparent 4K encodes are made at the moment; not until the encoding gets faster and more discs with proper 4K detail come out.
This allows for smoother colour gradations, avoids banding, etc.
For NF, prioritise 10-bit HEVC.
Most using this feature should probably be using an external drive to begin with.
1080p has 2.5x as many total pixels as 720p, so assuming the video track of your 1080p source is at least 2.5x the bitrate of the video track of your 720p source, you'll definitely see a better result with 1080p downsampled to 720p than just going 720p to 720p. This is because there's more overall information in the 1080p image.
A 720p BluRay might be better than a 1080p WEB-DL in quality.
A 1080p remux at 40 Mb/s will be better than a 4K file at 8-15 Mb/s. The problem is compression.
1080p compressed is closer to 720p quality, so if you get the 4K version, the compression ends up with higher quality on a 1080p monitor than the 1080p version.
x265 is far more capable than AVC/x264, so consider that too.
If you're downloading films and they say they're 1080p 10-bit, then they are usually 8-bit encoded as 10-bit (it's something to do with the codec).
On a similar note, can an Instagram viewer tell if a video was shot in 8-bit vs 10-bit colour? I don't feel that they could.
I've never had a client ask for anything bigger than 1080p.
You see the difference when you start to heavily grade your footage. If that's part of your workflow, or something you plan to do, 10-bit will make a big difference over 8-bit.
Would a 10-bit ProRes recording with an Atomos Star/Ninja at ~220 Mbps be better, or 4K 8-bit from a GH5 in h264 (150 Mbps)? Final output is for web.
A lot of movies are encoded down to the 2 GB range, and that is actually fine for most movies if it's x265, though for stuff that is supposed to be visually impressive, 5-10 GB.
10-bit will result in different colour (Hero 9 Black).
I find it very hard to go back to 1080p now that I've upgraded.
Hello everybody, I was wondering how much better 10-bit encoding is over 8-bit, in terms of x265 encoding.
Linus Tech Tips does a great video on how 1080p can look almost as good as 4K with a proper bitrate.
Even then, add a little bit of resolution for 2K (2048x1080) vs 1920x1080.
The warp-stabilizing nature of e-stab works by cropping in and upscaling the net result back to your set resolution, which creates the slight…
One is a normal 1080p Blu-ray x264 while the other is a 1080p 10-bit HDR Amazon rip.
10-bit or higher H.264 isn't as common due to less player support of it.
If you are filming in any sort of difficult light, or cannot get your exposure right and set colour balance before you shoot, then go for the 10-bit.
If 10-bit erases some colour banding in the sky when you view it with a nice MacBook Pro in 4K in Final Cut Pro, then who's to say it doesn't re-appear on Instagram when it is shrunk to 1080p on an iPhone 10? There are more variables than that, such as encoding settings.
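For the opening question (how much better is 10-bit x265 than 8-bit), the practical answer in the thread amounts to "encode a sample both ways and compare." This is a minimal sketch of that experiment with ffmpeg, using a placeholder input file and an arbitrary CRF, not a definitive benchmark.

```python
import os
import subprocess

def encode(src: str, dst: str, pix_fmt: str, crf: int = 20) -> int:
    """Encode src with libx265 at the given pixel format and return the file size."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", src,
         "-c:v", "libx265", "-pix_fmt", pix_fmt,
         "-preset", "medium", "-crf", str(crf),
         "-an", dst],                      # drop audio so only video size is compared
        check=True,
    )
    return os.path.getsize(dst)

size_8 = encode("sample.mkv", "sample.x265.8bit.mkv", "yuv420p")
size_10 = encode("sample.mkv", "sample.x265.10bit.mkv", "yuv420p10le")
print(f"8-bit: {size_8 / 1e6:.1f} MB, 10-bit: {size_10 / 1e6:.1f} MB")
```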