I'd second this info (to a certain extent). I use an AMD graphics card, and it is a known thing that raytracing hits AMD GPUs unusually hard in this title… (yes, worse than most, and beyond the typical framerate variation versus Nvidia counterparts). I play the game with everything set to Ultra, and for the first few months I simply brute-forced the graphics (no FSR 1/2 turned on). I mostly stopped playing after I saw patches make the game's graphics massively worse (whingers ALWAYS net graphics gimping; the squeaky wheel gets the grease). (Kingdom Come: Deliverance is another perfect example of top-tier graphics being downgraded because users say "every other game runs fine on Ultra settings (but not YOURS)!") Now, I do believe this game needs some serious polish, especially on the PC side of the coin. Obviously, PC Ultra settings haven't been widely tested, with most players opting to run without raytracing… Believing the game broken, I stopped playing for quite a while.

Now, I did have to tune my graphics card DOWN in performance; otherwise the junction temperature (the worst temperature spot, based on multiple measurement points, as I understand it) would exceed 110 degrees and either the PC would shut down or the game would drop to desktop (depending on how quickly that temp rose).

Combined with lowering the power into my GPU (and playing with fan curves), I could now game happily.

Last week I swapped two fans' directions around (an inlet became an outlet and vice versa); this dropped GPU temperature by 5 degrees Celsius. I accidentally left the framerate uncapped once and couldn't get the game started without crashing.

I have seen a few other titles let me hit 110 degrees junction temp (all raytracing titles, e.g. Portal RTX and Witcher 3 revisited), and I have booked my card in to be serviced (hopefully a fan repaste will help?!), but simply locking my framerate and keeping an eye on temperatures (a fan curve that keeps me from going above 112 degrees Celsius at junction temp)… and voila, everything is fine. (I beat the crash-on-start by running overlays and keeping Task Manager open while 'shader pre-caching' happened, so I could get into Settings and turn my frame rate lock back on.) I now lock my GPU fan to 2/5ths speed (~1600 rpm) and I have a totally quiet PC that runs the game flawlessly for many hours at a time. Turning RT shadows off removed all the graphical 'immersion breaks' I was experiencing, and removed basically all frame drops from the most difficult-to-render locations. FSR 2 adds a lag/latency that FSR 1 mode does not.

(There are still a few spots where my framerate drops by a third super briefly, but these are spots where the game needs some tweaking, and I can count them all on one hand over a lot of gaming time; 30+ hours in the last week or so.) Best use of HDR (Panasonic OLED) I have seen in gaming: Returnal and Demon's Souls were great/good, but Hogwarts, with its spell effects and a masterful understanding of light by the art department, has Lumos correctly light bugger all or light brilliantly depending on the darkness the player is experiencing… (depth of picture and immersion being 10/10).

Using FSR 1 mode and RT shadows off, I now play at 1600+ resolution with everything else set to Ultra.

RDNA 2 can brute-force this game (no FSR 'upscaling'), but sustaining 1440 with everything on Ultra (raytracing Ultra, etc.) is very taxing. If I frame-cap and tailor my fan curves, I keep my card from crashing in this game. But as a gamer since the seventies, and a PC system builder and overclocker since the nineties, this game is the nicest 'tech demo' I have ever had the fortune to play.

I know RDNA 3 can spit this game out much better, and the new generation of AMD graphics cards can do this title without sitting right at the edge of the thermal envelope. (I also brute-forced Crysis at Ultra settings on release, but ran at 720p due to owning a 'midrange' Nvidia 8800 GT at the time.) I do believe that this game pushes hardware heavily. Better than running 3DMark, and great for keeping the room warm in winter.

Fortunately I like the Hogwarts universe, and so am happy that the software that is most impressive on my PC has great music/art values and an interesting world to visit.

That being said, even as a PC tech of three-plus decades: Hogwarts Legacy is impressive, and 'very rewarding' to experience from a hardware enthusiast's point of view. My video card will happily eat 300+ watts of power, but I have tuned this game to sip power, so my usage varies from 150 watts to around 220 watts (peak average being 170-200 W).
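For anyone who wants to keep the same eye on junction (hotspot) temperature outside of an overlay, here is a minimal sketch. It assumes Linux with the amdgpu driver, which exposes labelled sensors (edge/junction/mem) under sysfs hwmon; on Windows you would watch the same value in AMD's Adrenalin overlay instead. The 110 °C limit and the helper names are mine, not anything from the original comment.

```python
import glob
import os

JUNCTION_LIMIT_C = 110  # hotspot temp at which crashes/shutdowns were seen above

def read_amdgpu_temps(hwmon_root="/sys/class/hwmon"):
    """Return {label: degrees C} for the first amdgpu hwmon device found."""
    for dev in glob.glob(os.path.join(hwmon_root, "hwmon*")):
        try:
            with open(os.path.join(dev, "name")) as f:
                if f.read().strip() != "amdgpu":
                    continue
        except OSError:
            continue
        temps = {}
        for label_path in glob.glob(os.path.join(dev, "temp*_label")):
            with open(label_path) as f:
                label = f.read().strip()  # e.g. "edge", "junction", "mem"
            with open(label_path.replace("_label", "_input")) as f:
                temps[label] = int(f.read()) / 1000  # millidegrees -> degrees C
        return temps
    return {}

def junction_headroom(temps, limit=JUNCTION_LIMIT_C):
    """Degrees of headroom left before the junction (hotspot) limit, or None."""
    return limit - temps["junction"] if "junction" in temps else None

if __name__ == "__main__":
    temps = read_amdgpu_temps()
    print(temps, "headroom:", junction_headroom(temps))
```

Run it in a loop (or under `watch`) while gaming; if the headroom keeps shrinking toward zero, that is the cue to cap the framerate or steepen the fan curve.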