2 March 17, 01:21   #51
DucFreak

Quote:
Originally Posted by DurgeDriven View Post
http://wccftech.com/amd-ryzen-7-1700...ck-benchmarks/

                                  1700X ($569AU)   6800K ($609AU)
GTA V (DX11)                      139 FPS          145 FPS
Alien: Isolation (DX11)           194 FPS          200 FPS
Battlefield 4 (DX12)              111 FPS          115 FPS
Ashes of the Singularity (DX12)   55 FPS           56 FPS
Civilization VI (DX12)            62 FPS           79 FPS
Doom (Vulkan)                     127 FPS          123 FPS


Nothing between those; AMD is more than just getting close.

I'd be interested to see SLI at 8x/8x vs 16x/16x.
Intel picks up a free 5% there, doesn't it?

Really only affects people happy to pay an extra grand for a 5% gain, I guess.


AMD will trounce it in encoding for anyone doing more than a few hours a day, i.e. how much is your time worth?
So, OC'ing the 1700X makes it run hotter and consume more energy in those conditions than hoped, even if it is on par (and no worse) with the Intel i7 CPUs in that respect as well.
The gaming performance is good, but Intel still seems to be a wee bit better.

Got to say... I'm not as impressed by, or as sure about, that first review as I thought I'd be, at least regarding OC'ing and gaming on that 1700X. Maybe other reviews will give different impressions?
I'm waiting for the reviews from Anandtech, Tom's Hardware and GamersNexus. They are usually on point and extremely reliable.

Quote:
AMD’s Ryzen 5 Processors Arrive in Q2 2017

AMD’s Ryzen 3 Quad Cores Arrive in 2H 2017
http://wccftech.com/amd-ryzen-5-3-la...rmance-reveal/

Oh bugger. So gamers on a budget, who usually make up the majority, will be waiting a bit longer for their most anticipated CPUs (and the respective in-depth reviews) after all.
.

Last edited by DucFreak; 2 March 17 at 01:42.

2 March 17, 02:14   #52
DurgeDriven

It seems like a weird way to do things.

I guess they're trying to get Intel users who are sick and tired of paying high prices to buy an even dearer CPU............. huh

lol



You would think that if the 4-cores are as fast as the 7600K/7700K, they would be shouting it from the rooftops.

2 March 17, 23:56   #53
DurgeDriven

Quote:
By far one of the most interesting and concerning points about today's launch of the AMD Ryzen processor is gaming results. Many other reviewers have seen similar results to what I published in my article this morning: gaming at 1080p
https://www.pcper.com/news/Processor...ng-tests-Ryzen

I don't get remarks like this........

Quote:
I'm waiting for the 4/4 or 4/8 to be the gaming chips, these are mostly overkill.
If an 1800X ($699AU) is barely faster @1080p than my 7600K ($340AU)... overkill? Wow.

Intel will still drop prices, as 1080p gamers would only be a small slice of total sales.

If more figures this week confirm this odd 1080p performance gap, a discounted Intel will probably be just as good a value... IF you only run 1920 like me?

Quote:
For buyers today that are gaming at 1080p, the situation is likely to remain as we have presented it going forward. Until games get patched or new games are released from developers that have had access and hands-on time with Ryzen, performance is unlikely to change from some single setting/feature that AMD or its motherboard partners can enable.
Bummer

P.S. What "sims" have ever had issues fixed in Nvidia or AMD driver releases?


Why would that change for next-gen sims, with either video card, going forward?

Same goes for motherboard optimization for sims, as if that is ever going to happen.

They only ever worry about the developers that sell the most product.

Last edited by DurgeDriven; 3 March 17 at 00:18.

3 March 17, 00:45   #54
DurgeDriven

Quote:
Originally Posted by syhlif32 View Post
While I suddenly hope we get some progress and competition in the CPU market, I bought an i5 6600K for $205 back in December.
Overclocked it to 4.7GHz with a $21 cooler without touching the voltage.

While there isn't any hyper-threading on the i5, I believe this is the market AMD have to compete with.
Something I remembered I'd forgotten to mention:

If you're still using auto voltage, you may be able to drop temps a few °C by finding the lowest stable voltage.

How good or bad the bin is makes a difference in the voltage required, and therefore the heat.

On auto, my 2500K would do 5.1GHz on air in Prime95, but it used 1.45v.

Set manually, it was stable in Prime95 at 5.1GHz @ 1.375v and ran 2-3°C cooler.
1.35v @5GHz was considered very good for a Sandy i5.

I ended up settling on 4.5GHz at only 1.280v.


BTW...

My Kabylake runs 3°C hotter than your Skylake, clock for clock, lol.

That being said, with a GTX1070 @1080p, am I really going to need or see a worthwhile gain in my sims at 4.7GHz? That is only 500MHz more for my Kaby, but it was 800MHz for your Skylake.

It only amounts to a little over half the gain you got.
At 4.2GHz I would maybe close that 3°C temp gap.
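
As a rough sanity check on that "little over half" figure - a minimal sketch, assuming the 4.2GHz (7600K) and 3.9GHz (6600K) boost clocks as stock baselines, and best-case linear FPS scaling with clock in a fully CPU-bound sim (a GTX1070 at 1080p will often be the limit instead):

Code:
# Relative clock gains of the two overclocks, assuming (optimistically)
# that a CPU-bound sim's framerate scales linearly with core clock.

kaby_stock, kaby_oc = 4.2, 4.7   # assumed 7600K boost clock -> OC target (GHz)
sky_stock, sky_oc = 3.9, 4.7     # assumed 6600K boost clock -> achieved OC (GHz)

kaby_gain = (kaby_oc - kaby_stock) / kaby_stock   # ~11.9%
sky_gain = (sky_oc - sky_stock) / sky_stock       # ~20.5%

print(f"Kaby 4.2 -> 4.7 GHz: +{kaby_gain:.1%}")
print(f"Sky  3.9 -> 4.7 GHz: +{sky_gain:.1%}")
print(f"Ratio: {kaby_gain / sky_gain:.2f}")       # ~0.58, 'a little over half'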

Last edited by DurgeDriven; 3 March 17 at 03:01. Reason: C for %

3 March 17, 12:58   #55
DucFreak

Quote:
Originally Posted by syhlif32 View Post
While I suddenly hope we get some progress and competition in the CPU market, I bought an i5 6600K for $205 back in December.
Overclocked it to 4.7GHz with a $21 cooler without touching the voltage.

While there isn't any hyper-threading on the i5, I believe this is the market AMD have to compete with.
Quote:
Originally Posted by DurgeDriven View Post
If you're still using auto voltage, you may be able to drop temps a few °C by finding the lowest stable voltage.
[...]
As long as there's decent CPU cooling and air flowing in/out of the case, he's safe with auto voltage.

With that said, I'd definitely agree that adjusting voltages manually is the better route, along with keeping SST enabled. It should run noticeably cooler than on auto voltage (and the CPU will appreciate it in the long term).

I'd start by setting the CPU voltage to something around 1.35v (though I doubt the 6600K @4.7GHz will work reliably below 1.39v~1.40v; it's not the same as a 2500K).
Then test stability at such CPU voltages (LinX and IBT have been good to me for that, but I know most prefer Prime95).

Also, keeping power-saving features like Speed Shift (aka SST, the successor to the older "SpeedStep") enabled even when overclocking is a good move. It makes sure full voltage is only applied when really needed, so the CPU runs cooler overall and saves on the electricity bill.
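
For a sense of why the lowest stable voltage matters so much: CPU dynamic power scales roughly with V² × f, so even a small voltage drop cuts heat noticeably. A minimal sketch using the 2500K numbers quoted above (the constant capacitance factor cancels out of the ratio; leakage power is ignored):

Code:
def relative_dynamic_power(volts: float, ghz: float,
                           ref_volts: float, ref_ghz: float) -> float:
    """Dynamic power relative to a reference point, P ~ C * V^2 * f."""
    return (volts ** 2 * ghz) / (ref_volts ** 2 * ref_ghz)

# Manual voltage vs auto voltage at the same 5.1GHz overclock:
print(relative_dynamic_power(1.375, 5.1, 1.45, 5.1))  # ~0.90 -> ~10% less heat

# The 4.5GHz @ 1.280v setting he settled on, vs auto 5.1GHz @ 1.45v:
print(relative_dynamic_power(1.280, 4.5, 1.45, 5.1))  # ~0.69 -> ~31% less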

3 March 17, 14:10   #56
DucFreak

Quote:
Originally Posted by DurgeDriven View Post
https://www.pcper.com/news/Processor...ng-tests-Ryzen

I don't get remarks like this........
[...]
P.S. What "sims" have ever had issues fixed in Nvidia or AMD driver releases?
[...]
This is a very interesting matter indeed, and one that is causing divided opinions and long discussions everywhere, including in articles from experts (like the one you linked).

First, we need to understand what is what, and where the products belong.

The AMD Ryzen 7 chips (1700, 1700X and 1800X), which have just been launched and reviewed, are meant to run against Intel Broadwell-E (the 8-core/16-thread models), for a fraction of the price(!).
And that's how you need to look at these - productivity-biased CPUs first and foremost. Very versatile, and now at really tempting prices.

The proven principle so far is that "more cores = lower clocks" (and vice-versa).
Considering that most games (and simulators) until recently have not been optimized for multi-core usage (and are therefore biased towards Intel CPUs), higher clocks have been preferred.
So it's no wonder that CPUs like the Intel i5 6600K/7600K and i7 6700K/7700K have been very competitive in games, having higher clocks for brute power (and more with overclocking).

But that you knew already.

The thing is, just as with productivity software, we're slowly starting to see games come out with noticeable multi-core usage, so we're again at the point where we were in 2008.
Remember the time when dual-cores were fantastic (the "go-to" gaming CPUs) and quad-cores were overkill? ...well, check how it is now...
I don't think anyone would recommend a dual-core today, unless it's a G4560 for a very low-budget gaming rig.

AMD will sell a bajillion Ryzen CPUs with 6 and 8 cores (and 12 and 16 threads), being already very competitive (AMD Vishera wasn't), at more affordable prices than Intel.
Which means there will suddenly be a humongous increase in mainstream users with competitive processors that use more cores/threads than before.
Also, AMD Ryzen's tech is still very new, battling in Intel-biased scenarios. With driver and BIOS updates, and more games being tested (and produced) with/for Ryzen as well, it will only get better.

What happens next is left to the imagination. And that is where opinions diverge.

The way I see it, this can lead to shifts in the market and its tendencies. An acceleration in the adoption of wider support for multi-core features (more so with Vulkan and DX12). And so on.
And this is why some believe AMD Ryzen will likely disrupt the gaming CPU market.

And, here's the thing: consoles like the PS4 and XboxOne have been using 8 cores/threads (AMD APUs) for a few years now. With many games going into production to be sold on all platforms (consoles and PCs), we've seen multi-core usage become more noticeable and common in PC games, and that'll only increase, becoming the standard.


Meaning, we're getting back to that point of about a decade ago, in that, if you are very meticulous when building a new gaming computer on a budget, you should consider not just the budget limit but also the games that you're going to play on it in the coming years.

You see, until now, games have shown a huge bias towards Intel processors, with little to no gains from anything above 4 cores/4 threads. But that is changing.

You may also remember the heated discussions of "Intel i5 (4/4 cores) versus i7 (4/8 cores)".
No one doubts the latter is the better chip; the matter was mostly how to justify the big price difference.
While having more than 4 cores and 4 threads has not been necessary, no one doubts that having more threads is future-proofing, and noticeably better for multitasking. The main issue has always been that Intel i7s are (too) expensive.

So, at the prices currently practiced, and looking at these reviews of Ryzen 7, I can honestly see myself recommending an AMD Ryzen 5 1400X (4/8 cores) over an Intel Kabylake i5 7600K (4/4 cores), and even recommending an AMD Ryzen 5 1600X (6/12 cores) over an Intel Kabylake i7 7700K (4/8 cores), because Ryzen will be nearly as good for current games (and will overclock well enough too), while having more threads and being more affordable.
.

Last edited by DucFreak; 3 March 17 at 18:14. Reason: spelling...(?)

3 March 17, 15:42   #57
MickeyMouse

We don't see dual-cores anymore because they don't make them on the high end anymore. If they made high-speed dual-cores we would still see them, and they would still be pretty good for gaming. I think it'll still be a while until games use more than 4 cores. It's not because most people don't have more, or that developers choose not to; it's because it's really, really hard to do. A game is a very poor example of software that benefits from threading. I'd say it's more that developers are being begrudgingly forced to thread games, because that's the only way you can get more power out of modern CPUs, no matter how inefficient. If any developer had the choice between a single core at (e.g.) 4GHz and a quad-core at 1.5GHz (6GHz effective), everyone would choose the single core, absolutely no question. My point is, for gaming, 4 cores are fine for the foreseeable future; a better GPU is far more important than more cores.
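
For what it's worth, Amdahl's law puts numbers on that intuition. A minimal sketch (the 60% parallel fraction is purely an assumption for illustration; real games vary a lot):

Code:
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the
# fraction of a frame's work that can actually run in parallel.
# "Effective GHz" = core clock * speedup, to compare against one fast core.

def effective_ghz(clock_ghz: float, cores: int, parallel_fraction: float) -> float:
    speedup = 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)
    return clock_ghz * speedup

p = 0.6  # assume 60% of the frame parallelizes (illustrative only)
print(effective_ghz(4.0, 1, p))   # 4.0  -> the 4GHz single core
print(effective_ghz(1.5, 4, p))   # ~2.7 -> the "6GHz effective" quad loses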

3 March 17, 16:55   #58
DucFreak

Quote:
Originally Posted by MickeyMouse View Post
We don't see dual-cores anymore because they don't make them on the high end anymore. If they made high-speed dual-cores we would still see them, and they would still be pretty good for gaming.
We now have a "high end" dual-core: the Intel i3 7350K.
It's 2 cores/4 threads, high clocks, and it's even a "K" processor (so, OC-friendly).
As good as a dual-core can get.

See the various reviews, they all converge - it's somewhat disappointing.
Such a shame, because it was launched too late; it'd have been a smash hit years back.
Very hard to recommend one (again, the Intel prices) when, for 15 bucks more, you can get an i5 7400 that will crush it in most scenarios and is more future-proof, while still being great for gaming.

Quote:
Originally Posted by MickeyMouse View Post
I think it'll still be a while until games use more than 4 cores. [...] My point is, for gaming, 4 cores are fine for the foreseeable future; a better GPU is far more important than more cores.
True, I'm not sure there is any simulator currently using more than 4 cores/4 threads.
With that said, the most recent versions of P3D, the latest betas of X-Plane 11, and AeroflyFS2 all seem to show that threads are starting to be used (how effectively/intensely in future, I have no idea). So a change, and care for that, is starting to appear.

Thing is, and I think I'm not alone in this, we play games other than simulators.
And, not to forget, we're still living (and will keep living) in a scenario where the PC is littered with popular games ported (both well and badly) from the consoles.

There are plenty of popular games making use of more than 4 cores/4 threads.
For instance:
  • Batman: Arkham City, Arkham Origins, Arkham Knight
  • Battlefield: Bad Company 2, BF3, BF4, and the recent BF1 (WW1)
  • Call of Duty: Advanced Warfare and Infinite Warfare
  • Civilization: V and VI
  • Dragon Age: Inquisition (and Mass Effect: Andromeda, to be launched this month)
  • Metro series: Metro 2033 and Metro Last Light
  • Titanfall 2
  • Witcher 3 (and the upcoming Cyberpunk 2077)
I'm sure there are many more by now.

The GPU is the limiting factor, agreed with that for sure.
But you can't put the meshes, physics, sounds and respective uncompressed files (much bigger today) on the GPU. The CPU is just as important, and then RAM and SSD/HDD also need to be good.

Detail is always increasing in all those areas, so it gets to a point where a steady average framerate becomes complicated. I'd say very difficult even, if capturing video (common practice now, and for which more than 4 cores and 4 threads is ideal).
I think this is where multi-core is right now most important for the end user, as the spread of work across the cores is bigger. It's only a matter of time until PC-only developers realize (or feel forced to adopt) multi-core in gaming as standard procedure. They have to evolve.
The consoles' hardware and OS work differently, but they've been able to handle such detailed games on such poor hardware partly because multi-core is fully utilized on them.
.

Last edited by DucFreak; 3 March 17 at 18:16. Reason: list of games

3 March 17, 18:20   #59
MickeyMouse

Quote:
Originally Posted by DucFreak View Post
We now have a "high end" dual-core: the Intel i3 7350K.
It has four threads, high clocks, and it's even a "K" processor (so, OC-friendly).
As good as a dual-core can get.

See the various reviews, they all converge - it's somewhat disappointing.
Such a shame, because it was launched too late; it'd have been a smash hit years back.
Very hard to recommend one (again, the Intel prices) when, for 15 bucks more, you can get an i5-7500 (locked quad-core CPU) that will crush it in most scenarios and is more future-proof, while still being great for gaming.
True, I suppose you could call that high end. The benchmarks do show, however, that the gain from 4 cores in games is pretty small. If it were half the price it would probably be popular.

Quote:
Originally Posted by DucFreak View Post
True, I'm not sure there is any simulator currently using more than 4 threads.
With that said, the most recent versions of P3D, the latest betas of X-Plane 11, and AeroflyFS2 all seem to show that threads are starting to be used (how effectively/intensely in future, I have no idea). So a change, and care for that, is starting to appear.
I'm not sure flight sims are the best example; they are well known for their horrendous performance, using mostly early-'00s code.

Quote:
Originally Posted by DucFreak View Post
There are plenty of popular games making use of more than 4 cores/4 threads that I've come across.
For instance:
  • Batman: Arkham City, Arkham Origins, Arkham Knight
  • Battlefield: Bad Company 2, BF3, BF4, and the recent BF1 (WW1)
  • Call of Duty: Advanced Warfare and Infinite Warfare
  • Civilization: V and VI
  • Dragon Age: Inquisition (and Mass Effect: Andromeda, to be launched this month)
  • Metro series: Metro 2033 and Metro Last Light
  • Titanfall 2
  • Witcher 3 (and the upcoming Cyberpunk 2077)
I'm sure there are many more.

The GPU is the limiting factor, agreed with that for sure.
I couldn't find any benchmarks that show those games taking advantage of more than 4 cores. Sure, they may use them, but if the difference is negligible then it's mostly an on-paper difference. BF1 showed a 15% increase from 2 cores to 4 cores, and Titanfall 2 showed a <1% difference from 2 cores to 4 cores, for example.

Quote:
Originally Posted by DucFreak View Post
But you can't put the meshes, physics, sounds and respective uncompressed files (much bigger today) on the GPU. The CPU, RAM and SSD/HDD are just as important.
The GPU should be able to hold all the meshes pretty easily, physics can be done on the GPU (but usually isn't), and the textures are usually compressed (DXTn). The sound card holds the sound buffers. Yeah, streaming textures can be an issue, but it seems they are catching up on VRAM.

Quote:
Originally Posted by DucFreak View Post
Detail is always increasing in all those areas, so it gets to a point where a steady average framerate becomes complicated. Difficult even, if capturing video (common practice now).
And I think this is where multi-core is going to prove more and more important.
True, it would help for recording. Like I mentioned earlier, encoding certainly benefits from more cores.

Quote:
Originally Posted by DucFreak View Post
The consoles' hardware and OS work differently, but they've been able to handle such detailed games on poor hardware partly because multithreading is fully used.
You really can't compare consoles and PCs. Knowing the hardware allows you to wring every ounce of performance out of it. You also have much finer control of the bare metal than a normal OS can provide.

3 March 17, 18:50   #60
DucFreak

Quote:
Originally Posted by MickeyMouse View Post
I couldn't find any benchmarks that show those games taking advantage of more than 4 cores. Sure, they may use them, but if the difference is negligible then it's mostly an on-paper difference. BF1 showed a 15% increase from 2 cores to 4 cores, and Titanfall 2 showed a <1% difference from 2 cores to 4 cores, for example.
The problem is, the average framerate in a benchmark's final result does not show how smooth (or not) the gameplay is. Which, in my personal experience, can make or break the gaming experience.

I'll give a personal story with my current CPU to illustrate this....

For a while I was interested enough ("greener pastures over there" and all that) to make an offer for an Intel i3 4170 (2 cores/4 threads, 3.7GHz) that a friend was selling.
I ended up having it here for a good while, so I had lots of time to compare gaming performance against the AMD FX6300 (6 cores/6 threads, 3.5GHz, 4.1GHz on turbo) that I'm still using, and have since overclocked.
I used the same GPU and RAM, as would be the case if the deal went through.

From StrikeFighters 2, BOB2, IL-2, LOMAC and FC2, through Arma II, Homeworld Remastered, Mass Effect 2 and 3, to Skyrim and Witcher 2 and 3, etc, I think I tried most games I had installed at the time to clear my doubts.

Unsurprisingly, the Intel i3 gets better maximum framerates, but the minimum framerates in some performance-intensive areas were also noticeably lower.
Sometimes stuttering and jerkiness would occur (very annoying, IMO); framerate fluctuation galore.
Something that I did not get with my AMD FX6300. On that one, the framerate was slightly lower on average, yes, but the experience was definitely much, much smoother (preferable to me, and it wasn't even close).

And I haven't even got to the overall usage experience yet.
Audio production and photo/texture editing software all benefit greatly from the bigger number of physical and logical cores in the FX6300 (versus the i3 4170); the same goes for big file compression/decompression jobs (common here) and, as seems clear to me now, gaming as well.

Again, benchmarks are necessary to get a rough idea, but they don't tell the whole story. Far from it.
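
To illustrate how two CPUs with similar average framerates can feel completely different - a minimal sketch with made-up frame times (the numbers are invented for illustration; one trace is steady, the other has stutter spikes):

Code:
from statistics import mean, stdev

smooth = [16.7] * 95 + [18.0] * 5     # steady frame pacing (ms per frame)
spiky  = [15.0] * 90 + [45.0] * 10    # similar average, with stutter spikes

def report(name, frame_ms):
    avg_fps = 1000.0 * len(frame_ms) / sum(frame_ms)       # true average FPS
    inst = sorted(1000.0 / t for t in frame_ms)            # per-frame FPS
    one_pct_low = mean(inst[: max(1, len(inst) // 100)])   # mean of worst 1%
    print(f"{name}: avg {avg_fps:5.1f} FPS, 1% low {one_pct_low:5.1f} FPS, "
          f"stdev {stdev(inst):4.1f}")

report("smooth", smooth)   # avg ~59.6, 1% low ~55.6, tiny stdev
report("spiky ", spiky)    # avg ~55.6, 1% low ~22.2, huge stdev -> stutter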
.

Last edited by DucFreak; 3 March 17 at 19:25. Reason: spelling...(?)

3 March 17, 19:32   #61
MickeyMouse

What you want to look at is minimum FPS and FPS standard deviation. The benchmarks I've seen don't show a significant difference in the avg/min ratio or the standard deviation between 2 and 4 cores. Are you sure you didn't have some configuration or software issue?

3 March 17, 19:48   #62
DucFreak

Quote:
Originally Posted by MickeyMouse View Post
What you want to look at is minimum FPS and FPS standard deviation. The benchmarks I've seen don't show a significant difference in the avg/min ratio or the standard deviation between 2 and 4 cores. Are you sure you didn't have some configuration or software issue?
Yes, I'm sure.
It even had a better HDD than the one I was using for games back then (WD Black 1TB SATA3 vs Samsung HD502HJ SATA2), and both PCs were using similar PSUs (a CX750 in it, a CX600 in mine).
Both on Win7 SP1, malware/spyware cleaned, antivirus off during gaming, etc.

Only two games were noticeably better with it, Flaming Cliffs 2 and Arma II. All the others played better on the FX6300.
I was very surprised, because benchmarks in reviews and forums were showing it as capable of competing with an FX8350, let alone an FX6300 (and mine was at stock clocks then).

I'm not sure an SSD would solve the occasional stutters and jerkiness, and I seriously doubt the framerate fluctuation would improve. Even if it did, that would mean a bigger investment (contrary to the point of the deal, after all).
The fact is, with practically the same hardware and conditions, the FX6300 (stock, back then) was better. Surprising? Maybe, but that's how it was for me.

Our theory at the time was that maybe the bigger number of threads lets the workload spread better, flattening spikes and bottlenecks (smoothing things out)? I really don't know; I only know how they felt.

3 March 17, 19:50   #63
DurgeDriven

Looks like Ryzen is dead to me; I just noticed they only have PCIe 2.0 lanes?


rFactor2 gains too much from PCIe 3.0 for it not to be used.

3 March 17, 20:00   #64
DucFreak

@ DD: Seriously, you just got yourself that Intel i5 7600K.
Why not keep it for a year, year and a half, and see what happens down the line?

And btw, AMD Ryzen is PCIe 3.0.
Quoting from http://www.gamersnexus.net/guides/27...x370-b350-a320 :
Quote:
PCIe Gen3 lanes for graphics are all on the CPU, with the currently known 8C/16T Ryzen chip hosting 16 PCI-e 3.0 lanes for the GPU.
These can be assigned to a single device or split in x8/x8 fashion (or multiplexed down to an x4 setup), but will be entirely reserved for the GPU.
Other devices on the PCI-e bus, like M.2 devices, will draw from general purpose or chipset lanes.
Four PCI-e Gen3 lanes are sent to the chipset to handle its I/O, and are then split amongst the connected devices.

3 March 17, 20:01   #65
MickeyMouse

Quote:
Originally Posted by DurgeDriven View Post
Looks like Ryzen is dead to me; I just noticed they only have PCIe 2.0 lanes?
rFactor2 gains too much from PCIe 3.0 for it not to be used.
That's interesting; in general there doesn't seem to be a big difference between PCIe 2.0 and 3.0.

3 March 17, 20:22   #66
MickeyMouse

Quote:
Originally Posted by DucFreak View Post
Our theory at the time was that maybe the bigger number of threads lets the workload spread better, flattening spikes and bottlenecks (smoothing things out)? I really don't know; I only know how they felt.
Each thread runs in a linear fashion, so it shouldn't matter. Most likely it was something in the background stealing processor time, or possibly a bad BIOS setting, a bad driver, etc.

3 March 17, 20:33   #67
DucFreak

Back to the OT and, regarding the differences noted in the AMD Ryzen reviews from different reviewers...

...this one got interesting (especially from 3:45 on):

3 March 17, 23:24   #68
DurgeDriven

Quote:
Originally Posted by MickeyMouse View Post
That's interesting; in general there doesn't seem to be a big difference between PCIe 2.0 and 3.0.
There is virtually none in general, you're right, but in rFactor2 it is heaps.

You can read the old thread here:

https://forum.studio-397.com/index.p....45758/page-16

Then you can read stuff like:

https://linustechtips.com/main/topic...-in-rfactor-2/

From a bunch who don't even run rFactor2!@! And without running a single test to confirm.

Know-it-alls..... lol

Would anyone have believed a Ryzen 1800X would be slow @1080p gaming?


I can tell you I tested it by BIOS, stock and overclocked, with 3 different boards, 3 GPUs and one SLI setup, and 3 CPUs, and PCIe 3.0 in rFactor2 is FASTER.
I don't care what those dills on that other site say.

They are saying everyone in the original thread who reported gains is lying, not just DRipper.

In the end my H97 and stock non-K Haswell was faster than the Sandy with the same GPU and RAM and 1100MHz less clock speed!

When I tested PCIe 2.0 I lost 10-15% with various combos of cars and tracks.

I would do the GTX1070 too, but I'm tired of recording stuff only for people to say the tests are fake (omg, I know it happens).
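
For reference, the raw headroom being argued about, from the PCIe spec numbers (whether a given game can actually feel this difference is another matter):

Code:
# Theoretical PCIe bandwidth per direction.
# PCIe 2.0 runs at 5 GT/s with 8b/10b encoding (80% efficient);
# PCIe 3.0 runs at 8 GT/s with 128b/130b encoding (~98.5% efficient).

def pcie_gbps(gt_per_s: float, payload_bits: int, raw_bits: int, lanes: int) -> float:
    """Usable GB/s for `lanes` lanes, one direction."""
    return gt_per_s * (payload_bits / raw_bits) * lanes / 8.0  # GT/s -> GB/s

print(f"PCIe 2.0 x16: {pcie_gbps(5.0, 8, 10, 16):.1f} GB/s")     # ~8.0
print(f"PCIe 3.0 x16: {pcie_gbps(8.0, 128, 130, 16):.1f} GB/s")  # ~15.8
print(f"PCIe 3.0 x8 : {pcie_gbps(8.0, 128, 130, 8):.1f} GB/s")   # ~7.9 (SLI split)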

3 March 17, 23:42   #69
DucFreak

Quote:
Originally Posted by DurgeDriven View Post
Would anyone have believed a Ryzen 1800X would be slow @1080p gaming?
No. Simply because it isn't true.

You may have noticed that in my previous post I attached a video about the differences noted in the AMD Ryzen reviews from different reviewers.

This is causing quite a commotion because, in some reviews, there are differences on the order of ~20%, which is a lot. And that is not normal.
From bad UEFI BIOS versions, to bad RAM and motherboards, premature tech, apparent laziness and negligence, and even conspiracy theories about reviewers being on payrolls (either Intel's or AMD's), we've had it all, it seems.

Anyway, where it gets interesting is that there wasn't any "slow" result for AMD Ryzen 7, anywhere.
If you paid attention, all these processors, be they Intel Kabylake, Broadwell-E or AMD Ryzen, and despite the differences in the benchmarks, perform great in 720p, 1080p, 1440p and 4K gaming benchmarks (as usual, in those last two the GPU is the limiting factor).


And, again, about the differences noted in the AMD Ryzen reviews from different reviewers, here's more wood for the fire.....

Finally, someone reviewed the cheapest AMD Ryzen 7, the 1700 (the "non-X" model, also overclockable).
Finally, someone who had a motherboard with the correct BIOS and correct memory speeds, and who took the time to overclock, then also benched various games for head-to-head results.

There you go....

Last edited by DucFreak; 4 March 17 at 00:31. Reason: added video

4 March 17, 00:54   #70
DurgeDriven

That is good then

4 March 17, 02:11   #71
DucFreak

Quote:
Originally Posted by DurgeDriven View Post
That is good then
No, it isn't.
Your reaction changing from displeased to pleased with the reviews/benchmarks just proves a point.

The issue is that the reviews are all over the place. A few more or less match the AMD Ryzen CPUs with the i7 7700K even at 720p, and others horribly dump the new chip to sub-standard levels.

Most of that is teething issues from the new architecture (clearly, looking at the reports of RAM, BIOS and motherboard issues).
I guess it's worth remembering that the Intel Haswell, Broadwell and Skylake/Kabylake chips are all very mature at this point, sharing many similarities between them, all of them built on tech proven and established through the years.

Personally, I'm slowly becoming disappointed with the AMD Ryzen launch, but for reasons completely unrelated to pure performance.

In a few months the performance numbers will almost certainly look better for AMD Ryzen, as newer specific RAM appears, as BIOS and Win10 drivers are improved, and probably even as some games get patched to work better with AMD Ryzen CPUs. But the downside is a possible scenario in which Win7 users will not see this CPU perform anywhere near as well as it does with Win10.
Meaning, unless the parties involved (or a third party) take a look at it and release something, an AMD Ryzen for Win7 users (like myself) is starting to look like a no-no. I hope I'm wrong.
.

Last edited by DucFreak; 4 March 17 at 02:30.

4 March 17, 02:50   #72
DurgeDriven

I was just being optimistic; the "that is good" was for the others, lol.

My opinion has never changed.

I would buy a 1400X IF I had not already got the 7600K.

What I meant is, if it just so happened that I could sell my CPU/mobo (like I did my Haswell) in a system, the lower-priced Ryzen would recoup most of the loss.

I never said they don't have PCIe 3.0.
I said they only run 8x/8x in CF/SLI.

Most tests show a 5% gain at 16x/16x over 8x/8x.
That is why extreme users pay so much for 32-40 lane CPUs: to run 2, 3 or 4 cards faster.

That 5% is the average; in some games Intel gets back 20%.

Which was my point: I don't care what games run multi-core or how many lanes... all I care about is sims, and as far as I can tell Ryzen is not going to improve them, and nothing will change with pCARS2 or AC2 or iRacing etc etc.

Also, what I meant is: why say that about waiting for a 4-core?

Ryzen beats the 6900K, but a 7700K will beat a 6900K or better in single-threaded gaming all day long......... 1800X ($699AU), 7700K ($485AU).

So for me it makes no sense at all, and if an 1800X can't do it for me, why would a 1400X?

If I were worried about multi-threading and encoding, obviously it would.

4 March 17, 07:24   #73
DurgeDriven

Quote:
Originally Posted by DucFreak View Post
In a few months the performance numbers will almost certainly look better for AMD Ryzen, as newer specific RAM appears, as BIOS and Win10 drivers are improved, and probably even as some games get patched to work better with AMD Ryzen CPUs. But the downside is a possible scenario in which Win7 users will not see this CPU perform anywhere near as well as it does with Win10.
Meaning, unless the parties involved (or a third party) take a look at it and release something, an AMD Ryzen for Win7 users (like myself) is starting to look like a no-no. I hope I'm wrong.

None of that is going to matter for existing sims, and I don't think sim developers will spend money on optimization, as they don't even do it now for video cards? lol

The only ones who ever get Nvidia bug fixes is Codemasters.

I could not give a rat's bum about Ashes of the Singularity.


Point is, some went out and pre-ordered a 1700 to replace their beloved aging 5GHz Sandy/Ivy and DDR3, only to find out it is no faster, even slower, in the titles they like....... like sims, for instance, lool


4 March 17, 19:16   #74
DucFreak

Quote:
Originally Posted by DurgeDriven View Post
None of that is going to matter for existing sims, and I don't think sim developers will spend money on optimization, as they don't even do it now for video cards? lol

The only ones who ever get Nvidia bug fixes is Codemasters.

I could not give a rat's bum about Ashes of the Singularity.
Yeah, well, there you go. That's why I pretty much quit racing sims three years or so ago.

The lack of vision and ambition from the devs has been uninspiring, if not frightening. The newer gen is stale, and a lot worse than the previous gen. Good luck with support, btw.

And you (still) don't care for any gaming genres other than sim racing?
There are much, much greener pastures out there in PC games, believe me.

Quote:
Originally Posted by DurgeDriven View Post
Point is, some went out and pre-ordered a 1700 to replace their beloved aging 5GHz Sandy/Ivy and DDR3, only to find out it is no faster, even slower, in the titles they like....... like sims, for instance, lool

*shrugs*

First, I'm sure you've heard the term "early adopter of new tech" before. People who pull the trigger so prematurely know what they are in for. They are obviously accepting, in advance, any teething issues from the new architecture, as well as unknown performance in very specific scenarios. Nothing new here.

Second, and for the umpteenth time, these AMD Ryzen processors were never marketed or considered to be faster than Intel at IPC/single-core tasks (like gaming), only competitive for a lot less money (and, in that, they are absolutely kicking arse). BIG difference.

If you thought differently, then you have not paid attention.

Quote:
Originally Posted by DurgeDriven View Post
I never said they don't have PCIe 3.0.
I said they only run 8x/8x in CF/SLI.

Most tests show a 5% gain at 16x/16x over 8x/8x.
That is why extreme users pay so much for 32-40 lane CPUs: to run 2, 3 or 4 cards faster.
Yes, you did say that. Quoting yourself:
Quote:
Originally Posted by DurgeDriven View Post
Looks like Ryzen is dead to me; I just noticed they only have PCIe 2.0 lanes?

rFactor2 gains too much from PCIe 3.0 for it not to be used.
Now that PCIe lanes are no longer a problem (AMD Ryzen is PCIe 3.0), you pick another complaint.
The problem now is 8x/8x for SLI/Crossfire... and gains for extreme builders(!?)...

An extreme user will always build an i7 7700K-based rig for games, every single time, because that's the one that clocks higher and overclocks best. An extreme user has no budget limits, won't look at expenses, nor care about the law of diminishing returns.
That's a completely different crowd from the one AMD is aiming these new processors at.

SLI/Crossfire works just fine at 8x/8x; the difference is ~5%(!!) compared to 16x/16x.
In any case, SLI/Crossfire is dying. The scaling in games is erratic (at best) and it's expensive.
The number of people with multi-GPU setups is extremely small (relative to the general userbase) and will only become smaller, considering prices have increased with the new gen of GPUs introduced last year.

Lastly, RF2.... jeeebus.
If that's all you care about, heck, just get a nice Intel i3 dual-core with high clocks for it (a 7300, for example); you will never need more than that for it.

BTW, that's probably the only game in history where PCIe 3.0 shows a noticeable difference over PCIe 2.0.
Just shows how much of an oddball unoptimized POS that game was, and still is.

Quote:
Originally Posted by DurgeDriven View Post
Also, what I meant is: why say that about waiting for a 4-core?

Ryzen beats the 6900K, but a 7700K will beat a 6900K or better in single-threaded gaming all day long......... 1800X ($699AU), 7700K ($485AU).

So for me it makes no sense at all, and if an 1800X can't do it for me, why would a 1400X?

If I were worried about multi-threading and encoding, obviously it would.
Five little letters.... M-O-N-E-Y

That's why you go into hardware forums and see people saying they're reading every review and impression, and still holding off their long-awaited PC upgrade for the 6/12 and 4/8 core/thread CPUs, the AMD Ryzen 5 processors (coming out April-June 2017).

If you notice, all these Ryzen 7 processors, whether 1700, 1700X or 1800X, perform about the same in games in general, at around i5 7600K levels or better (as in, they perform very well).

The Ryzen 5 and Ryzen 3 chips will perform about the same as the Ryzen 7 chips in current games, but will be more affordable (see the first post in this thread). That's why these are the ones to wait for, meant for the mainstream/budget arena.

If it's not making sense yet, let's take an example with the rumoured Ryzen 5 1400X.
If the leaked info keeps turning out to be true, then (I suspect) the most popular match-up next summer may be "R5 1400X vs i5 7600K":

  • Raw games performance at about the same level in general (i.e. very good). Likely better overall with the Ryzen 5 1400X.

  • The Ryzen 5 1400X has 4 cores/8 threads; the i5 7600K has 4 cores/4 threads.
    Multi-core programs (recording, streaming, encoding, rendering, file (de)compression, etc) will be faster with the Ryzen 1400X.
    Possible higher gains with future games making use of multi-core tech with the Ryzen 5 1400X.


  • Both processors are overclockable, and relatively economical regarding energy consumption.

  • The tech in AMD Ryzen is new. The platform will be continuously supported and improved until at least 2020.
    The tech in Intel Kabylake is older and mature. Platform support and improvements are unlikely to happen.


  • The Ryzen 5 1400X ($199) is expected to be more affordable by at least $50 compared to the i5 7600K ($250).
    Motherboards and RAM have similar prices for both, if not cheaper with AMD Ryzen.

Think about it from the perspective of someone with general needs (i.e. the vast majority), ready to upgrade an old system or build a new one from the ground up on a limited budget, about three months from now. Do the math (a small sketch below), weigh the pros and cons. I think it's a no-brainer.
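
The math, literally, using only the rumoured prices and core counts listed above (assuming, as stated, roughly equal raw gaming performance between the two):

Code:
# Value comparison from the figures in the list above (rumoured, USD).
r5_price, i5_price = 199, 250        # Ryzen 5 1400X vs i5 7600K
r5_threads, i5_threads = 8, 4

saving = i5_price - r5_price
print(f"Price difference: ${saving} (~{saving / i5_price:.0%} cheaper)")
print(f"Threads per dollar: 1400X {r5_threads / r5_price:.3f} "
      f"vs 7600K {i5_threads / i5_price:.3f}")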
.

Last edited by DucFreak; 5 March 17 at 04:50. Reason: spelling...(?)

6 March 17, 00:37   #75
DurgeDriven

In the end, for my needs, they will all do the same job in sims @1920.

By the time I can afford a good screen and cockpit for online racing I will be upgrading CPU/GPU again......... who knows, maybe 9th-gen Intel will be best.


This is why I still think the 7600K vs 1700X comparison, from my perspective, is fair enough.

I could not have waited months to buy a Ryzen 4-core; I had already sold my H97/4690, so I would have been forced to get a 1700 at $120AU dearer than the 7600K I bought.


http://hexus.net/tech/reviews/cpu/10...nm-zen/?page=6

The 7600K barely loses out there; sims would show similar performance.
Kabylake is $240AU cheaper: 329 vs 569.

But sure, look here: the 7600K gets spat out and left to die. hehehe
http://hexus.net/tech/reviews/cpu/10...nm-zen/?page=3

But things like that do not bother me.


I don't play games, only drive sims..... and Chessmaster, Pac-Man, 3D Tetris
and 3D Pool....

I could do with faster encoding, but it's not worth paying extra in my situation.

Personally I don't think the $300AU+ I have paid for each of 3 Intel i5s is overpriced at all. What about when AMD were selling XPs here for $400/500 and every 200MHz jump was $100, lool.

In Australia the 1400X will hardly end up cheaper than the 7600K, and that is before an Intel price drop.... IF they do one.

19 March 17, 14:13   #76
syhlif32

http://hothardware.com/news/microsof...ws-7-windows-8

Just a heads-up on Ryzen and Kabylake CPUs.
Might not be a big issue, as the updates are likely not important anyway.

Just got a feeling that when enough people have jumped (or been forced) onto the W10 bandwagon, we will see subscription fees.

19 March 17, 17:56   #77
MickeyMouse

Strange decision by Microsoft. You wonder who paid for that special 'feature'.

19 March 17, 19:16   #78
DucFreak

A strange decision by MS indeed.

I'm still convinced that, if true, ways around this crap will be found, but that doesn't mean it will be convenient or foolproof.

Being a Win7 SP1 user (with all the Win10 upgrade gimmicks cleaned off), perfectly happy with it and unwilling to change, maybe I should also consider an Intel Haswell or Skylake CPU for my next PC upgrade.

19 March 17, 23:39   #79
syhlif32

I too am a Win7 SP1 user. I did try Win10 and, apart from a driver issue with some cams I have, it worked well. Maybe even better than Win7!
The real turn-off for me was the spying.
Not sure if the French ever got that sorted with MS?

That apart, it can't be good news for AMD that its new Ryzen CPUs might have limitations with operating systems?

20 March 17, 01:58   #80
DurgeDriven

If you have W7, the SP1 version is all you need.

All the other updates won't make you any safer or make sims run better; if anything, worse.

W10 with, as you say, all the gimmicks removed is better than W7. I always said my problem was that rF2 ran badly in W10 with micro-stutter (never solved it and went back to my W7).

Now, with Kaby + GTX1070, rF2 is lovely in W10, so I will never go back to W7.

Let's face it, if you're going to get hacked you're going to get hacked no matter what you do, so I would not worry about spying or other stuff too much.

Before W10, billions had their Facebook, iPhone, PayPal and the rest hacked, so what is W7 going to do better?

You can streamline W10 the same way as W7.

(I have nothing installed in Windows 10 Features except basic .NET Framework 4.5.)
My W10 Pro is only 16.5GB with all the tools I use added (hibernation disabled, etc).

I turn off every single thing and remove everything from the firewall.

(That is the biggest drama with W10: every time a feature or app is updated, whether it is installed or not, its firewall rule gets re-enabled.......... so after updates you always check the firewall again.)

I use Chrome, VLC and Winamp; all the Windows Media crap is made unavailable.

The biggest hassle with W10 is that, if you have a heap of gMotor sim folders, you may need to redo all the configs and exes with admin permissions.


Upgrade method:

DO a FULL IMAGE BACKUP of your W7, including the Reserved space and the partition table.

I recommend another drive myself; clone your W7 to it, you only need a 60GB SSD minimum.

Go HERE: https://www.microsoft.com/en-us/acce...ndows10upgrade

LET Windows 10 install and validate Windows. You don't want updates....
yet.

Now go here: https://www.microsoft.com/en-us/soft...load/windows10

Click on "Download tool now".

RUN the MEDIA TOOL and download your W10 Pro to a 4/8GB thumb drive.


NOW run the W10 setup from the thumb drive for a full fresh install.


IMPORTANT !@!

ONLY FORMAT C and click NEXT.

DON'T touch any other partitions.


W10 will be activated as soon as you boot.


P.S.

Seriously, anyone, including those using a hacked Windows 7/8.1, should upgrade; it is still free to "anyone", so grab it.

......... and you don't need to pay anyone anything for a key that may or may not remain legit!

Quote:
We have not announced an end date of the free upgrade offer for customers using assistive technology. We will make a public announcement prior to ending the offer.

Last edited by DurgeDriven; 20 March 17 at 03:25. Reason: P.S.

20 March 17, 09:21   #81
DucFreak

Quote:
Originally Posted by DurgeDriven View Post
If you have W7, the SP1 version is all you need.

All the other updates won't make you any safer or make sims run better; if anything, worse.

W10 with, as you say, all the gimmicks removed is better than W7. I always said my problem was that rF2 ran badly in W10 with micro-stutter (never solved it and went back to my W7).
[...]
You can streamline W10 the same way as W7.
Durge, I have no doubt that Win10 is the better choice of OS for those looking only to the future. I don't dislike the system per se.
The thing is, there are still problems with Win10 for users like myself, who also use many programs and games from way back, some from the WinXP 32-bit era and even older, which do not work with Win10 or, when they do, are utterly unstable or bugged (regardless of compatibility settings or admin-rights checks).
Whereas with Win7 there are no problems, none whatsoever (in my own long experience with it).


Then there's the spying stuff, which is an effing huge can of worms...
Regarding this subject, I guess it does depend on the individual and their principles, how you see it.

As a funny side note, the movie 1984 is a great piece of fiction, but it's also supposed to be thought-provoking, because it builds on the (real) premise that every possible Big Brother-controlled totalitarian society starts from somewhere, for supposedly good and harmless reasons, for everyone, at any time. Because the majority of people are convinced to allow it to be.
How much do we allow? How much is too much? Is it going too far already? Should we accept it?

Personally, and on principle, I refuse to give in and accept the most blatant and wide-open invasion of privacy since the invention of operating systems (btw, I don't give a rat's arse about farcebook).

Win10 logs and communicates every mouse click, keystroke, program/game installed, even websites visited (browser history), supposedly even Skype/MSN contacts.
Talking with some IT fellas here, it's rumoured that it even has a backdoor for remote access to installed webcams and microphones. Not nice.
Yes, yes, it's not like Windows OSs have ever been innocent (no version has been), but it went too far, waaaay too far, with Win10.

Over 18 months since the launch of Win10, some solutions can now cut many of these intrusive aspects with a good degree of success. The problem is that, so far, no solution is 100% effective, and no one really knows yet to what extent you can or cannot disable this abhorrent side of Win10. Still too many unknowns in an obscure side of an otherwise good OS, which taints any possible trust in it (IMO).
.

Last edited by DucFreak; 20 March 17 at 10:13. Reason: spelling...(?)

20 March 17, 10:15   #82
DurgeDriven

Understood.


I dropped early sims with W7 as many never looked any good in widescreen, i.e. the F1 series and GTRacing2002 by SimBin etc etc. I still think RBR looks better in 4:3 too.
Stuff like, say, the original Driver works terribly on W7; in fact most DX6/7 games I can think of did (I tested them all and stuck with XP well after W7's release for the same reason)........ Vista was horrific, I never used it.

As for older software, isn't it better run on a box from the period anyway?

I think it is; most early games look and work better on a 19"-22" CRT too.


The other option is to run 2 OSes on 2 HDs and select at boot, but that becomes tedious, especially if you run UEFI and fast boot and need to swap BIOS settings.

Not the evil dual-boot way... nono lol

20 March 17, 10:55   #83
DucFreak

Quote:
Originally Posted by DurgeDriven View Post
As for older software, isn't it better run on a box from the period anyway?

I think it is; most early games look and work better on a 19"-22" CRT too.
To some extent I can agree, but there's a really positive upside for old games on modern hardware - noticeable performance improvements.

One example: the emulators for older consoles.
There are many console games that I love going all the way back two decades and more, so I have emulators for the Sega Saturn and Dreamcast, Sony PlayStation 1 and 2, Nintendo Gamecube and Wii. Some of these demand a fast, modern system. Some don't really like Win10, and prefer Win7.
All those emulators can run those games on the PC in improved ways that you could only dream of back in the day of those old consoles: FullHD, UltraHD, 8xAA, 16xAF and 60+fps, etc, even mouse and keyboard support.

The same applies to old PC games that ran like poo and can now run with unseen eye candy and mind-boggling framerates (too many to list).
For example, I keep BoB2 WoV installed. It's an old combat flight sim with, surely, the best AI ever seen in a game of the genre, with combat between 100+ airplanes in the sky (no other sim does this). It was originally pretty harsh on hardware, and now it runs like a dream (and looks great) on Win7 with modern hardware. It runs extremely unstably on Win8/8.1/10 (if at all), with no solution so far for those.

If you have a system that allows the peaceful and stable coexistence of older and newer programs (like Win7 does for me), perfectly fine, then why change that?
See my issue here when upgrade time comes?
.

Last edited by DucFreak; 20 March 17 at 11:07.

21 March 17, 01:01   #84
DurgeDriven

Ah okay, emulators, fair enough. I was never into console games.


But stuff like your NASCARs and IndyCars, GP1-4, Driver, RC2000, EA F1 2000 and up all ran better on old systems with Pentiums/Athlons, DX6 to DX8 video cards and Windows 95-98....
"Start Me Up" (the 95 theme song) lol

Go install F1 2002 on W7 in widescreen (which everyone uses) with a late Nvidia card and have a gander at how it looks: shocking. None of those ever needed much framerate to be smooth, 30fps give or take.

21 March 17, 03:33   #85
MickeyMouse

Quote:
Originally Posted by DucFreak View Post
Win10 logs and communicates every mouse click, keystroke, program/game installed, even websites visited (browser history), supposedly even Skype/MSN contacts.
Talking with some IT fellas here, it's rumoured that it even has a backdoor for remote access to installed webcams and microphones. Not nice.
Yes, yes, it's not like Windows OSs have ever been innocent (no version has been), but it went too far, waaaay too far, with Win10.
I think that's unlikely. I haven't seen any evidence of keyloggers or the like. If there were real evidence, it would make major news. It still logs too much, and they make it too hard to turn off, but it isn't quite that bad.

21 March 17, 06:45   #86
DucFreak

Quote:
Originally Posted by MickeyMouse View Post
I think that's unlikely. I haven't seen any evidence of keyloggers or the like. If there were real evidence, it would make major news. It still logs too much, and they make it too hard to turn off, but it isn't quite that bad.
Oh, it's more than likely.
Try researching the subject around and you'll see that, it is actually a thing since Win10's day one. It was supposed to be just used on the preview but made it to the official Win10 release.
BTW, it seems there were updates for Win7 and Win8 that enabled the keylogging on them as well.
Not so inocent diagnostics and telemetry.

I'm not saying people should run around with hands in the air yelling "the sky is falling" in crazy tinfoil-hat fashion (not yet lol) but, really, I think everyone can agree that it's a load of BS having this kind of invasive crap working in the background of one's supposed "Personal Computer" (the irony!).
IMO, it's like spyware and malware.

You can accept it by ignoring it, or you can try to at least do something about it (on your own computer).

And for those wanting to do something about it, start by looking into the two following videos.
This guy's style is a bit of an acquired taste, but he's good with hardware/software and he worked at MS for over a decade:
.

Last edited by DucFreak; 21 March 17 at 08:21. Reason: added links
DucFreak is offline   Reply With Quote
Unread 21 March 17, 11:00   #87
DucFreak
 
DucFreak's Avatar
 
Join Date: Mar 2006
Location: Lisboa, Portugal www.gtlw.co.uk
Age: 42
Default

Meanwhile, and back to the original topic, a simulated preview of the Ryzen 3 and 5 series:

(the benchmark scores start at 3:59)
DucFreak is offline   Reply With Quote
Unread 28 April 17, 20:09   #88
DucFreak
 
DucFreak's Avatar
 
Join Date: Mar 2006
Location: Lisboa, Portugal www.gtlw.co.uk
Age: 42
Default

The new Ryzen 5 1600 and 1600X (6 cores, 12 threads) seem to be particularly interesting for the price.

FWIW, posting a couple of review articles of the new AMD Ryzen 5 from two of my favorite hardware critics (technically well supported and presented), which also compare it directly against the Intel i5 and i7 counterparts.

http://www.anandtech.com/show/11244/...ads-vs-four/17

http://www.gamersnexus.net/hwreviews...rgument/page-4


The conclusion seems to be....

The argument for the Intel i5s is starting to fade. It's still there, for now, but fading.
The current juggernauts are, interestingly, the Intel i7 7700K and the AMD Ryzen 5 1600X (with an overclock).

If you’re purely gaming and not looking to buy in $300-plus territory, it’s looking like Ryzen 5 CPUs are close enough to Intel i5s to justify a purchase, if only because the frametimes are either equal or somewhat ahead.

On performance, for anyone wanting to do intense CPU work, the Ryzen 5 1600 and 1600X get the nod. Twelve threads are hard to pass up at this price point.
Interestingly, buying a Ryzen 5 1600 or 1600X (+/- $250) and overclocking it would also, more or less, invalidate the more expensive Ryzen 7 purchases.

Going beyond 8 threads doesn't do a whole lot for your gaming experience but, as has been shown, going beyond 4 threads does help with consistent frametimes. That is as important as framerate in making games appear smooth. It's not that you can't have a good experience with 4 threads in most games, but this is the direction we're moving in. 16 threads won't matter much anytime soon, but 8 will, and already does.

The Intel i5 CPUs are still good and provide a decent experience but, for gaming, it’s starting to look like either you’re buying an i7 7700K (significantly ahead of all AMD Ryzen CPUs), or you’re buying an AMD Ryzen 5 CPU.

If you buy an AMD Ryzen 5, overclock it, and buy good memory, it’ll be competitive with Intel.
That said, be wary of spending so much on the platform and memory that you’re put into Intel i7 7700 territory, because, at that point, you’d be way better off with the Intel i7 for gaming.



PS: It seems that the AMD Ryzen 5 1600 (the non-X version) may be the pick of the bunch for the cost-conscious buyer, as it includes a decent (if basic) cooler (the 1600X costs $30 more and comes with no cooler). The 1600 will also overclock nearly as well as the 1600X does (so, to about 4.0GHz).
Also, the more affordable B350 motherboards will overclock the AMD Ryzen CPUs just fine, so long as you've ruled out multi-GPU (no SLI, no Crossfire - that is left for the X370 motherboards).

.

Last edited by DucFreak; 28 April 17 at 21:46.
DucFreak is offline   Reply With Quote
Unread 29 April 17, 05:09   #89
DurgeDriven
Premium Member
 
DurgeDriven's Avatar
 
Join Date: Jan 2011
Default

Quote:
the i5 range is now obsolete

Its only advantage is for pure gaming
lool


........and for pure gaming I will take a weaker CPU + faster GPU any day of the week

I mean seriously, just work it out will ya:

7600K@4.2GHz, H-series gaming board, stock RAM, mid hydro cooling = ? $

7700K@4.8GHz, Z-series gaming board, fast RAM, better hydro cooling = ? $

You know what the difference buys in GPU, at least in Oz?

GTX1070 Gaming 1 to a GTX1080 Extreme

That is using the same PSU, and the i5 setup is still cheaper!

No overclocked i7 can gain that fps back with the slower GPU, that's a fact

https://www.techpowerup.com/reviews/...dition/30.html
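To make the sums concrete, here's a rough back-of-envelope sketch in Python. Every price in it is a made-up placeholder for illustration (not a real quote), and the part names are generic assumptions:

Code:
# Fixed-budget logic: the platform savings from an i5 build can pay for a GPU tier.
# All prices are placeholder AU$ figures, NOT real quotes.
i5_build = {"i5 7600K": 340, "H-series board": 170, "stock DDR4": 140, "mid AIO": 120}
i7_build = {"i7 7700K": 470, "Z-series board": 300, "fast DDR4": 230, "better AIO": 190}

saving = sum(i7_build.values()) - sum(i5_build.values())
print(f"platform saving: ${saving}AU")  # $420AU with these placeholders

# Placeholder GPU prices: GTX1070 Gaming ~$650AU, GTX1080 Extreme ~$1050AU.
gpu_step = 1050 - 650
print("saving covers the GPU jump:", saving >= gpu_step)  # True here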
DurgeDriven is offline   Reply With Quote
Unread 29 April 17, 05:30   #90
DurgeDriven
Premium Member
 
DurgeDriven's Avatar
 
Join Date: Jan 2011
Default

I'm not trying to hijack the thread, Duc, you know me by now

It's not really fair yet to compare cheaper non-OC Ryzen builds to fast OC builds, as memory support and other stuff still needs tweaking.

Still, it would not surprise me if you could build 2 Ryzen rigs for the same price, one with a GTX1080 Extreme and one with a GTX1070 Gaming

I think we'd find the same thing in your graphs, Duc: the slower stock Ryzens are barely 10-15% slower than the fastest when it comes to dual/quad-core games like sims, and you gain back 25%-30% with the faster GPU

They used an EVGA GTX 1080 FTW1


EDIT:

Which build below do you think would yield more in games, Duc?

(I'm not saying they are the best options of course, and I know those prices will vary from place to place)

Mind you, the OC build has the same RAM to be fair, and both can use the same 650-750W PSU

If you take off $145AU for the AIO, which I know you don't care for, and add say your favorite Cyborg ($55AU), they are basically the same price

Reviews say the Spire does a good job on the 1700




Last edited by DurgeDriven; 29 April 17 at 10:30.
DurgeDriven is offline   Reply With Quote
Unread 29 April 17, 18:01   #91
DucFreak
 
DucFreak's Avatar
 
Join Date: Mar 2006
Location: Lisboa, Portugal www.gtlw.co.uk
Age: 42
Default

@Durge:

IMHO, the Ryzen 7 (8 cores, 16 threads) based systems are a bad purchase for gaming purposes.
As said dozens of times in this thread, Ryzen 7 are not gaming-oriented processors, just like Intel Broadwell-E are not gaming-oriented processors. (...have you seen prices for those instead? LOL)

Yes, the R7 1700 is effectively the best purchase among the three Ryzen 7 processors. It even overclocks similarly to its most expensive brother.
But if gaming is the only concern then, instead of that, get an Intel i7 7700 ("K" or not), which can be acquired for a similar price and is a LOT better for that specific purpose.
Or get the AMD R5 1600 ("X" or not) instead, for considerably less money.

- - -

Where it gets interesting (and why I posted those reviews here) is the conclusion that all these AMD Ryzen 5 reviews arrive at.

For the last few years, the Intel i5s ("K" or not) have been the main choice for gamers. The champion of the price/performance balance. What most of us have been recommending or using.
But there is now a contender in the AMD Ryzen 5 (especially in the form of the R5 1600, "X" or not), which actually ends up preferred and recommended after a direct comparison with the Intel i5s ("K" or not).

We now know that AMD Ryzen has good IPC (a weak point of the older AMD FXs), at about Intel i5/i7 levels.
We also now know that Intel i5 and AMD R5 are about the same in gaming, and that building a system around either costs about the same.

In a time when multi-threading is becoming more relevant (and will only become more so), be it for games or software/tools, there is no reason to invest in a processor with 4 cores and 4 threads when you can get one with not just 4 cores and 8 threads, but even 6 cores and 12 threads, for the same price.
One that performs (at least) as well in games (actually, it gets better frametimes), and is objectively better at everything else.

The point to take is that no one should recommend the Intel i5s any longer, when the AMD R5s have proved to be much more multi-purpose, future-proof and, effectively, the better processors.

A month after the somewhat clumsy introduction (issues with faster memory), the new AMD Ryzen platform has seen noticeable improvements, with updated motherboards and BIOSes. And it'll only get better.
So, the next move is now on Intel. I hope they finally bring out something new, better/faster than AMD Ryzen 5 at the same price (see the CoffeeLake rumours about the i5 hexacore).
.

Last edited by DucFreak; 29 April 17 at 19:20. Reason: spelling...(?)
DucFreak is offline   Reply With Quote
Unread 29 April 17, 23:20   #92
DurgeDriven
Premium Member
 
DurgeDriven's Avatar
 
Join Date: Jan 2011
Default

Quote:
Originally Posted by DucFreak View Post
The point to take is that no one should recommend the Intel i5s any longer, when the AMD R5s have proved to be much more multi-purpose, future-proof and, effectively, the better processors.
Obviously anyone buying new, yeah, but future-proofing for sims is bunkum to me

Drop a GTX1070 into a Sandy Bridge @5GHz and see how that goes


I only used the 1700 as a comparison to even up prices on a $1,800AU / $1,350US budget

Anyway, MY point is that on a FIXED budget a stock i5/1700 setup will beat an i7/1800X OC in pure gaming for exactly the same money, running stock clocks, with less power and less heat
DurgeDriven is offline   Reply With Quote
Unread 1 August 17, 22:20   #93
bob gnarley
 
bob gnarley's Avatar
 
Join Date: Oct 2008
Location: Noah vale
Default

Just to follow up I've had my Ryzen build up for about a month now.

NZXT S340 mid tower Black
Asus Prime X370-Pro m/b + latest BIOS
Ryzen 1600 (all cores/threads stable at 3.816GHz @ 1.2563V)
Corsair Vengeance LPX (white) 3000 @ 2966
Asus RX480 Dual 8GB (white)
Phanteks PH-TC14S 140mm CPU cooler + Thermal Grizzly Kryonaut TIM
Asus Xonar AE sound card
Superflower Leadex 550W (white)

The only thing that I can compare it to is my old Phenom rig and obviously there is no comparison there. Absolutely no complaints performance wise.

BIOS issues seem mainly ironed out, except maybe for non-QVL RAM and those seeking high memory overclocks. I did an awful lot of lurking around the forums at Overclock.net and gained much useful information. My o/c was boringly straightforward, and this board has a feature that allows the use of Intel XMP memory profiles, so the memory clock was one setting.

With all Ryzen chips, it seems most will o/c up to around 3.8GHz on low voltages, showing good performance gains. Above that they tend to need bigger lumps of voltage for lesser gains. Non-X CPUs will, if you are lucky, o/c as well as any X chip, but X CPUs are the better bet for high clocks. Don't expect too much though: 4.1-4.2GHz is golden. Non-X CPUs seem to be happiest at 3.8-3.9GHz.

For anyone planning higher overclocks, try and stretch to an X370 board. They all have a couple of extra VRMs (voltage regulator modules) over the B-series boards, which will help the life of your rig. People who have o/c'd 1700s to 4.1GHz on B-series boards are complaining that "the VRMs are getting hot." lol.

The most important thing with Ryzen is to select memory from AMD's QVL to avoid issues. While single-sided Samsung B-die kits offer the best overclocking possibilities (3600MHz and above if you can get it stable), Hynix kits like mine seem good to around 3200. Stock is 2400MHz, which steps down to 2133. Some of the best performance gains with Ryzen come from memory overclocking, with 2966-3200MHz seeming the sweet spot.

Overall, completely happy with the experience and would certainly recommend anyone consider Ryzen for their next build. For anyone not looking to overclock, an X-series CPU with higher stock speeds paired with a B350 board would be a great choice. I tried this setup at its stock 3.2GHz, core boost disabled, mem @ 2133MHz, and it still ran RF2 & AC like a champ.

Also, as you should with any new motherboard, be sure to update to the latest BIOS. Especially true with Ryzen... but that's it really. Happy days!
bob gnarley is offline   Reply With Quote
Unread 2 August 17, 03:24   #94
DucFreak
 
DucFreak's Avatar
 
Join Date: Mar 2006
Location: Lisboa, Portugal www.gtlw.co.uk
Age: 42
Default

Very nice, BG! That overclock with the R5 1600 @ 3.8GHz on just 1.25~1.26V is fantastic (I see some people struggling to get there under 1.35V with the same processor!). Good feedback too.

I've got an opportunity with an X58 mobo to fit a Xeon X5660 (an old and much cheaper combo, to be OC'ed sky high, though used parts are a risk), but yours is the CPU and chipset model I've been targeting (the AMD Ryzen R5 1600 with a B350 mobo), as it's maturing great and looks to be the best purchase right now (only the R7 1700 and i7 7700K surpass it, but at a higher price).
It seems that so long as you don't mind fiddling with the settings (to OC it manually), the "non-X" models are just as good, as they (pretty much) OC the same and are more affordable than the "X" models.


PS: never used Thermal Grizzly Kryonaut and have been on the fence, looking at the compliments it gets.
Is it really that much better than good old trusty thermal paste products like MX4, MX2 or AS5?
.

Last edited by DucFreak; 2 August 17 at 03:48.
DucFreak is offline   Reply With Quote
Unread 2 August 17, 19:38   #95
bob gnarley
 
bob gnarley's Avatar
 
Join Date: Oct 2008
Location: Noah vale
Default

Thanks man!

The Xeon, if it suits your needs and pocket, could be a great move for a couple of reasons.

As you know with AMD, whether the initial release is good or bad, the revision is usually much improved. By which I mean Piledriver being a big step better than Bulldozer, Phenom II better than... etc. If Ryzen is good, Ryzen 2 should be a great CPU. Certainly an interesting prospect and not too far away.

X299 wasn't Intel's response to Ryzen. We still haven't seen that yet. They have certainly taken a bruising lately and I would expect them to come out all guns blazing. Maybe worth keeping your options open.

Ok, "X" and none "X" cpu's. All Ryzen 5 & 7 cpu's are the same as in they are all manufactured as 1800X's. If then during the binning process one or more cores show signs of instability or defects these cores can be disabled making it a lower core count cpu. If there are signs of instability in the memory fabric between cpu's something here, can't remember what, can be disabled making it a non "X" cpu. I don't think there is a difference in reality though, don't remember anybody reporting so.

My feeling with a non-"X" is that if you are unlucky you get a CPU that, in part at least, had inherent instabilities. It will run at stock no problem, but stable o/c's of any kind may be a different matter. On the other hand, you may get lucky and get a CPU that simply has those features disabled. Manufacturers surely can't be turning out that many imperfect CPUs. The 1600, being the most popular in the range, may be the best place for that 40 pound/euro/dollar gamble.
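Just to illustrate the binning idea from the last two paragraphs, here's a toy model in Python. It's purely illustrative (the rules and SKU names are my assumptions, not AMD's actual process):

Code:
# Toy model of die binning - NOT AMD's actual process, just the idea:
# every die starts as an 8-core part, and test results decide the SKU.
def bin_die(bad_cores: int, fabric_ok: bool) -> str:
    """Map a tested 8-core die to a hypothetical SKU."""
    if bad_cores == 0:
        return "Ryzen 7 'X'" if fabric_ok else "Ryzen 7 non-X"
    if bad_cores <= 2:  # disable two cores -> 6c/12t part
        return "Ryzen 5 'X'" if fabric_ok else "Ryzen 5 non-X"
    return "lower SKU / scrap"

print(bin_die(0, True))   # Ryzen 7 'X'
print(bin_die(2, False))  # Ryzen 5 non-X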

Regarding thermal paste: obviously "which is the best TIM?" is a subjective question that will draw a dozen different answers, which is why I was happy to come across this comparison chart. (post #13)

http://www.overclock.net/t/1491876/w...#post_22646705

I was hovering between 60-61°C with Prime95 when stress testing on a warm summer afternoon. AMD recommends a max of 75°C, above which they say degradation may occur. It's worth noting that the limit with Ryzen o/c's tends to be stability rather than thermals.
bob gnarley is offline   Reply With Quote
Unread 2 August 17, 23:12   #96
DucFreak
 
DucFreak's Avatar
 
Join Date: Mar 2006
Location: Lisboa, Portugal www.gtlw.co.uk
Age: 42
Default

I guess that's the best plan possible for AMD: a modular design where little faults/limitations on each individual chip in the production line also dictate which model it's going to be. Very clever, simpler and practical, with clear benefits in production costs. A bit like what we saw with the Phenom I and II, where the dual and tri-core chips were quad-core chips with one or two cores disabled, usually due to small faults in production. Sometimes there was the lucky chip where you could (re)enable the cores on the lower models with the right mobo/BIOS. Which makes me wonder if any motherboard manufacturer is investigating this possibility with Ryzen. hehe

Thanks BG, bookmarked that overclock.net thread. (lots of info in there to digest later)
DucFreak is offline   Reply With Quote
Unread 6 August 17, 19:48   #97
DucFreak
 
DucFreak's Avatar
 
Join Date: Mar 2006
Location: Lisboa, Portugal www.gtlw.co.uk
Age: 42
Default

DigitalFoundry (very trustworthy) also confirms:

DucFreak is offline   Reply With Quote
Unread 6 August 17, 21:21   #98
bob gnarley
 
bob gnarley's Avatar
 
Join Date: Oct 2008
Location: Noah vale
Default

Thanks for the post DF!

Choice for the consumer is a wonderful thing. When AMD announced a couple of years ago that they were leaving the desktop market, I thought I would be priced out of it. AMD have certainly livened things up by staying!

I would certainly say that the 1600 is a good gaming CPU. You can go faster with Intel, though probably not at this price point. I have seen it offered for as little as £180 here in the UK.

Where it really comes into its own is if you also use it for multicore applications, game streaming etc. A quick comparison against a 7600K with CPU-Z shows:

My CPU @ 3.8GHz single core score - 443.6
7600K @ 3.8GHz single core score - 480

My CPU multithread score - 3569.3
7600K multithread score - 1837
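Worked out, that's 3569.3 / 443.6 ≈ 8.0x scaling across the 1600's twelve threads, versus 1837 / 480 ≈ 3.8x across the 7600K's four cores: roughly double the multithreaded throughput for similar money.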

As I say, it's a good gaming CPU, but if you do more than game it's awesome value. I also get the impression it could handle a much better card than an RX480...

You don't really want that Xeon, do you?

On a side note, you may have seen this video as it's been around a couple of weeks now, but it's a must-watch for anyone with an interest in the ongoing war between Intel and AMD, or anyone involved in the PC marketplace over the last few years. Quite eye-opening.

bob gnarley is offline   Reply With Quote
Unread 6 August 17, 21:59   #99
DucFreak
 
DucFreak's Avatar
 
Join Date: Mar 2006
Location: Lisboa, Portugal www.gtlw.co.uk
Age: 42
Default

Quote:
Originally Posted by bob gnarley View Post
As I say, it's a good gaming CPU, but if you do more than game it's awesome value. I also get the impression it could handle a much better card than an RX480...

You don't really want that Xeon, do you?

On a side note, you may have seen this video as it's been around a couple of weeks now, but it's a must-watch for anyone with an interest in the ongoing war between Intel and AMD, or anyone involved in the PC marketplace over the last few years. Quite eye-opening.

https://www.youtube.com/watch?v=osSMJRyxG0k
If an Intel i5 7600K doesn't seem to bottleneck a GTX1080Ti (the fastest consumer GPU atm), then an AMD Ryzen R5 1600 surely does not bottleneck any existing GPU either.

I love listening to that fella at AdoredTV (Scottish, I think?). He's articulate and tells it like it is in his videos, without being ludicrous or dramatic, so they're usually a very good watch.
Thanks for the link, as I've yet to see that one (I should subscribe to the fella's channel).

On that Xeon I mentioned for the X58 mobo, my situation is different from most...

To explain why that particular chip: those Xeons (X5650, X5660, X5670, X5680) overclock really easily. All of them can easily reach 4.0GHz, and even up to ~4.9GHz with luck (and top cooling). They are now found on eBay very cheap. And I mean really cheap.

Note that these are 6-core, 12-thread chips. If OC'ed, performance is comparable to an i7-3960X.
Ok, that's not Ryzen 1600 performance (which I'd prefer), but it's still a LOT of performance for the buck!

The only real problem is getting a motherboard to do that. It needs to be a top OC-oriented X58 motherboard and, of course, one that also supports those Xeon server chips after a BIOS update. You won't find any below 200 euros (used!), and that's on eBay.

Have a look: http://www.overclock.net/t/1489955/o...-x58-xeon-club
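For anyone wondering how these locked-multiplier chips get there: on X58 it's all base-clock (BCLK) overclocking, so the sums are simple. A quick sketch, assuming the X5660's stock 21x multiplier (2.8GHz stock = 133MHz x 21):

Code:
# X58 overclocking: core clock = BCLK x multiplier (the multiplier is capped).
# Assumes the X5660's stock 21x multiplier - 2.8GHz = 133MHz x 21.
multi = 21
for bclk in (133, 160, 180, 200):  # MHz
    print(f"BCLK {bclk}MHz -> {bclk * multi / 1000:.2f}GHz core")
# 133 -> 2.79GHz (stock) ... 200 -> 4.20GHz (needs luck and top cooling).
# RAM and uncore clocks scale with BCLK too, hence all the board-level tweaking.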

Now, as I said, my situation is different from most.
I happened to rescue a very good Asus X58 motherboard that was considered "bricked, dead" by a "trusty" PC shop in the area (rofl!), after a bad BIOS update from the internet.
I simply bought a new dedicated BIOS chip for it (eBay businesses ftw!) and, lo and behold, with the replacement chip it's now working flawlessly again. LOL
The owner is a friend and already got a working replacement. He's considering letting this one go to me for a very fair deal. It's compatible with the Xeon X5650/X5660/X5670/X5680.
Come September, I think I'll be getting one of those Xeons and making the overclock race happen.
.

Last edited by DucFreak; 6 August 17 at 22:47.
DucFreak is offline   Reply With Quote
Unread 6 August 17, 22:40   #100
bob gnarley
 
bob gnarley's Avatar
 
Join Date: Oct 2008
Location: Noah vale
Default

To be fair, a few hardcore o/c'ers have become a little disillusioned with Ryzen. I had decided on my middle-of-the-road o/c long beforehand and luckily had no issues reaching it. For some of those seeking bleeding-edge performance, the experience isn't so smooth.

Intel still wins for o/c enthusiasts! Have fun!
bob gnarley is offline   Reply With Quote