Graphics Card Review: Zotac GTX 1060 Mini (6 GB)

t3chg33k

This purchase was never on my radar until the day I made it. My existing GPU, a GTX 660, was serving me well enough for the few hours I spent gaming every weekend. True, I was running recent games at Medium to High details and perhaps not hitting 60 FPS, but it wasn't distractingly noticeable compared to the experience of an underpowered GPU more than a decade back. However, the 660 made the decision for me when it simply burnt out at the stroke of midnight on Independence Day (the irony!). It was strange to see burn marks on the PCB along with broken capacitors, but I wasn't too perturbed, because I guess I secretly did want to upgrade. I had been sheepishly keeping an eye on the AMD RX 480 and the Nvidia GTX 1060 since their launches, unable to convince myself to jump the fence, so the (un)timely demise of the GTX 660 wasn't much of a shock, especially as I was at the sweet three-generation gap between GPU purchases.

When it came to deciding between the RX 480 and the GTX 1060, I feel the only thing the RX 480 really had going for it was the price, which isn't a factor at all in India due to some absurd local pricing. The RX 480 does seem to have better DX12 async compute performance as per current benchmarks, but it has much higher power consumption and certainly runs hotter, even though AMD has apparently fixed some issues through driver updates. This was important for me, as I have been using the same Corsair 450W power supply for 8 years now and was in no mood to change it. The GTX 1060 in fact has a lower TDP than the GTX 660, so I was actually reducing my power consumption while getting much higher performance. Another thing AMD has going for it is CrossFire support, but I never have had and never will get a dual-GPU setup; it is simply unrealistic at this price point. Keeping objectivity aside, I must admit that Nvidia feels more invested in the PC platform after losing out to AMD in the console arena. The driver support is much better, and features like Ansel and Simultaneous Multi-Projection indicate that they are all-in on PC. Also, I have borne allegiance to Nvidia for over 17 years now, starting with the Riva TNT2 M64 and subsequently moving on to the FX 5200, 9600 GT, GTX 660 and now the GTX 1060. So perhaps the decision was already made even before I started to make it.

Coming to the GPU itself, the form factor is indeed small compared to the GTX 660 I discarded. It has a single fan with a direct-contact aluminium heatsink, which on the face of it seems to be a downgrade compared to the dual fans and copper heat pipes I had on the GTX 660, though looks can be deceiving. On the flip side, it means much better spacing in my cabinet, which I hope will afford better ventilation throughout. It might seem illogical to go for a small-form-factor card with an ATX cabinet, but to be honest, I had been on the lookout for the cheapest GTX 1060 I could find, and the Zotac Mini, at about 3k less than the AMP! edition, fit the bill perfectly. Also, the 2 + 3 years warranty on registration is simply phenomenal, and I can have some peace of mind knowing that I am covered should anything go wrong like it did with my GTX 660. The packaging is barebones and comes with literally nothing, so make sure you already possess any screws needed for installation. It runs on a single 6-pin power connector, so almost any decent power supply unit should have you covered. The card itself has 1x DVI, 1x HDMI 2.0b and 3x DisplayPort 1.4.
[Image: GPU-Z readout]

Since the package doesn't come with a disc, you have to rely on GeForce Experience for the drivers; until then, Windows shows something generic like 'Microsoft Display Device'. Zotac makes its own GPU tool called Firestorm, and I decided to give it a go before trying out something else like MSI Afterburner. Firestorm seems to have the tools needed for some basic tweaking but isn't the most elegant. At idle, after installation, my GPU reported a temperature of 35 deg. Celsius with a core clock of 139 MHz and a memory clock of 405 MHz. The base (1506 MHz), boost (1708 MHz) and memory (2002 MHz) frequencies are at reference values, so there is no overclocking out of the box, unlike the AMP! edition.
[Image: idle clocks and temperature]

While I have no need to run benchmarks since I have nothing to compare against, I decided to give 3DMark a go to see how the combination of this GPU and my CPU (i5-3470) performs relative to the review rigs running this GPU. The Graphics score of 13,123 in Fire Strike (Performance) compares favourably with the GTX 1060 Founders Edition scores that can be found on the web. The Physics score of 6120 indicates that my CPU may be the weak link in my setup. However, the combined score of 4592 is only about 2-3% lower than reviews with much beefier CPUs, so I wouldn't deem the CPU to be much of a bottleneck as far as gaming is concerned. Taking a quick peek at games, it was simply a pleasure to see the GeForce Experience optimisation turn up all the settings to 'Extra High' on MGSV: The Phantom Pain. I am one of those who has a huge backlog on Steam, because of which I don't purchase big-ticket games on release, but there was no way I could resist pre-ordering Deus Ex: Mankind Divided, and I am salivating at the prospect of maxing that one out too using DX12.
[Image: 3DMark Fire Strike results]

A key point of this card is the single-fan setup, so it is imperative to keep an eye on the temperatures. I decided to stress test the GPU using FurMark to see how well it copes under pressure. From an idle temperature of 35 deg. Celsius, the card hit 80 deg. shortly after the 3-minute mark, but thereafter it stayed stable at that temperature through the 5-minute mark, with the fan running at 72% of its maximum speed. Interestingly, the core clock was constantly over 1800 MHz at full load, which is higher than the stated boost frequency of 1708 MHz. A custom fan profile that bumps up the fan speed at higher temperatures ought to reduce the temperature compared to the 'Auto' mode I used, and will likely be essential when overclocking. As for me, I can't justify overclocking at this moment, considering that all my needs at 1080p are taken care of.
[Image: FurMark stress test at the 5-minute mark]
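For those wondering what a custom fan profile actually amounts to: it's just a piecewise-linear map from GPU temperature to fan speed, which tools like Firestorm or Afterburner let you set graphically. A minimal sketch in Python (the breakpoints below are made up for illustration; they are not Firestorm's actual 'Auto' curve):

```python
# Illustrative fan curve: (temperature in deg C, fan duty in %).
CURVE = [(30, 30), (50, 45), (65, 60), (75, 80), (85, 100)]

def fan_duty(temp_c):
    """Return the fan duty (%) for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    # Linearly interpolate between the two surrounding breakpoints.
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
```

A more aggressive profile simply raises the duty values at the hotter breakpoints, trading noise for lower load temperatures.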

As you can tell, I am extremely pleased with this purchase. I think Nvidia has hit the nail on the head with the GTX 1060. It felt great to be able to purchase it just a month after its global release, something that was unheard of when I made my GPU purchases in the past. The price may be a bit all over the place at the moment depending on the seller, but I got mine for a shade less than 20.4k through a local seller on eBay, taking advantage of eBay's high-value 12% discount coupon. Also, the premium paid with all the import duties isn't as obnoxious as it used to be. Hence, I can heartily recommend this to anyone looking for a VR-ready card that is going to max out absolutely anything at 1080p (and maybe at 1440p) for years to come.
 
What are your load temps during regular gaming?
80c is very high, but I am guessing that's due to FurMark.

I was curious about the mini 1060s since they fit so nicely and neatly, but there are no reviews out there which provide temperature readings in the Indian climate.

In the end I went with the MSI Gaming X 1060.
Bulky design, but it hasn't gone above 60c yet at 0 rpm (the fans don't spin until you cross 60c).

The EVGA models reportedly provide excellent cooling even with the small single fan form factor but the supply is extremely low atm.
 
I am gaming at 1080p, but I seem to have hit an unexpected bottleneck. All the while I was wondering whether the CPU would be the limiting factor when getting the GPU, but instead it turns out that 8GB RAM just isn't enough for the highest settings in modern games. Both Deus Ex: MD and MGS V: TPP crash on loading a level with a "low system memory" message using the optimal settings from GeForce Experience. Pushing down the texture quality a couple of notches gets me a bit further before crashing. Hence, I have ordered 2x4GB sticks to augment my RAM and should really be able to push it later this week.

For what it's worth, Time Spy and Fire Strike from 3DMark pushed the GPU to a temperature of 50-55 deg. Presently, I am finally completing the two episodes from HL2 (been resisting since eternity), and that barely pushes the GPU to 35 deg. from an idle temp of 31 deg. with 8x MSAA and 16x anisotropic filtering.
 
CPU is never a bottleneck anymore as long as you have an i5 2500 or similar.

Faster CPUs are still required for heavy editing and encoding.
But games have hit a wall as far as processing power is concerned since it's all about pixel crunching now (AI and physics haven't evolved at all).


There's a 3FPS difference between an i5 2500k and an i7 6700k:
Cost of i5 2500k in 2012: 12000/-
Cost of i7 6700k in 2016: 26000/-

[Image: i5 2500k vs i7 6700k benchmark comparison]
 
@saumilsingh You are right on the mark about that. Anyway, I received my G.Skill Ripjaws X kit today, so I will run the DX: MD benchmark at night to see how it performs with my i5-3470 (@3.6 GHz) at Very High (DX11) for comparison. I will also log the temperatures and post them to address that part of the equation.
 
Strange that you received Out Of Memory errors with 8GB. Did you try changing settings manually? The GeForce 'Optimal' thing isn't very optimal at all, and you'd be better off tweaking manually.

I too am waiting for a G.Skill Ripjaws X 8GB kit and now you have me worried that it won't be enough.
My friend is playing DX:MD just fine with 8GB, though; he reported no stutters or anything.
 
I ran the benchmark with the Very High preset, and HWiNFO showed a maximum memory usage of about 6 GB out of the 13 GB available (Windows takes up the other 3). So, in theory, usage may go up to 9 GB in total. Having said that, I think my Windows RAM consumption is on the higher side. But 8 GB is still marginal, given that it is the minimum requirement for DX:MD.

As for the benchmark, it went as follows:
Very High preset (DX11): AVG: 45.1; MIN: 37.4; MAX: 57.4
Geforce Experience Optimal (DX11): AVG: 39; MIN: 30; MAX: 50
I am simply chuffed at getting about 80% of the performance of a GTX 1080 in the benchmark, partnered with an i5-3470. Also, the GeForce Experience Optimal settings are on the higher side, as they are a mix of Very High/Ultra, and it seems a bit too 'optimal' when you consider the minimum FPS exactly hit 30.

As for the temperatures, the card hit a maximum of 69 deg. during the benchmark with the fan at 61%. Idle temperature was 34 deg. and the average was 49 deg.
 
@t3chg33k Offtopic, but how's DX: MD? There were reports that the PC port is full of bugs, so if it's running fine, I will pick it up in two days when my FUP resets.


Yeah, not-so-great performance from what I hear, but that is understandable as the game really pushes the limits. However, the game itself is good; the previous one in the series was spectacular. I'm an old DX fan, so I'm picking this up over the weekend as well.

Also, the PC port is done by Nixxes, one of the better porting houses out there (the Tomb Raider ports, DX: Human Revolution), so I expect things to be sorted out sooner or later. A DX12 port of the game is also in the works and will be released in early September.
 
I hope they don't repeat history and actually make the performance significantly worse as time passes.

The original HR runs smooth as silk, while the Director's Cut stutters every time you change zones, reload a savegame or re-enter a previously visited zone.

Unplayable basically.
And to make things worse, Steam no longer sells the original version.
 
I never found the Director's Cut to stutter on my GTX 660, or else I was too engrossed to notice. :p

Edit: Just for the sake of curiosity, the Ultra preset benchmark results are as follows; still more than playable, though I am mostly going to settle on settings of my own.
AVG: 37.1 MIN: 28.8 MAX: 46.9
 
Playable, but that's not enjoyable by any standard.
Would you mind running the benchmark at 1600 x 900 High with Temporal AA?

My tolerance for variable framerates is very low. I find 'unhinged' framerates choppy so anything other than 60fps @ 60hz V-sync'ed doesn't feel fluid (even 100fps @ 60hz feels choppy to me).
To that effect, I don't mind dropping to 1600x900 to maintain 60@60.

Really want to try a G-Sync display in person...
 

Here are the 1600x900 results, still quite some way from a minimum of 60 FPS. Temporal AA is enabled for the High and above presets; MSAA is simply unthinkable on this card. Maybe you are better off waiting for DX12 support to arrive next week.
AVG: 58.9 MIN: 49.8 MAX: 61.0
 
Behold! The astounding jump in performance with the DX12 preview.
Very High preset (DX12): AVG: 44.8; MIN: 33.3; MAX: 57.1

In case you haven't been following this thread, the earlier DX11 results were as follows:
Very High preset (DX11): AVG: 45.1; MIN: 37.4; MAX: 57.4

The patch notes warn about regressive performance on very high end cards. Until now, I just didn't know I had one. Next stop, September 19 for the full DX12 release.
 
DX12 has been very disappointing.
Nobody was expecting graphical improvements, as this time around it was supposed to be all about performance, it being a close-to-the-metal API, but so far it hasn't delivered.
 