The GeForce 10 series is a series of graphics processing units developed by Nvidia, based on the Pascal microarchitecture announced in March 2014. The series succeeded the GeForce 900 series and was itself succeeded by the GeForce 16 series and the GeForce 20 series, which use the Turing microarchitecture.
Release date | May 27, 2016 |
---|---|
Manufactured by | TSMC, Samsung |
Designed by | Nvidia |
Marketed by | Nvidia |
Codename | GP10x |
Architecture | Pascal |
Models | GeForce GTX series |
Fabrication process | TSMC 16 nm (FinFET), Samsung 14 nm (FinFET) |
API support | |
DirectX | Direct3D 12.0 (feature level 12_1), Shader Model 6.7 |
OpenCL | OpenCL 3.0[1][a] |
OpenGL | OpenGL 4.6[2] |
Vulkan | Vulkan 1.3 |
History | |
Predecessor | GeForce 900 series |
Successor | GeForce 16 series, GeForce 20 series |
Support status | Supported |
Architecture
The Pascal microarchitecture, named after Blaise Pascal, was announced in March 2014 as a successor to the Maxwell microarchitecture.[4] The first graphics cards from the series, the GeForce GTX 1080 and 1070, were announced on May 6, 2016, and were released several weeks later on May 27 and June 10, respectively. The architecture incorporates either 16 nm FinFET (TSMC) or 14 nm FinFET (Samsung) technologies. Initially, chips were only produced in TSMC's 16 nm process, but later chips were made with Samsung's newer 14 nm process (GP107, GP108).[5]
New Features in GP10x:
- CUDA Compute Capability 6.0 (GP100 only), 6.1 (GP102, GP104, GP106, GP107, GP108)
- DisplayPort 1.4 (No DSC)
- HDMI 2.0b
- Fourth generation Delta Color Compression
- PureVideo Feature Set H hardware video decoding, adding HEVC Main10 (10-bit) and Main12 (12-bit) profiles and VP9 decoding (Maxwell GM200 and GM204 did not support HEVC Main10/Main12 or VP9 hardware decoding)[6]
- HDCP 2.2 support for playback and streaming of 4K DRM-protected content (Maxwell GM200 and GM204 lack HDCP 2.2 support; GM206 supports HDCP 2.2)[7]
- NVENC HEVC Main10 10-bit hardware encoding (except GP108, which does not include NVENC[8])
- GPU Boost 3.0
- Simultaneous Multi-Projection
- HB SLI Bridge Technology
- New memory controller with GDDR5X & GDDR5 support (GP102, GP104, GP106)[9]
- Dynamic load balancing scheduling system. This allows the scheduler to dynamically adjust how much of the GPU is assigned to each of several concurrent tasks, keeping the GPU saturated with work except when no more work can safely be distributed. This allowed Nvidia to safely enable asynchronous compute in Pascal's driver.[10]
- Instruction-level preemption. For graphics tasks, the driver restricts preemption to pixel-level granularity, because pixel work typically finishes quickly and pixel-level preemption is far cheaper than instruction-level preemption. Compute tasks receive thread-level or instruction-level preemption instead: they can run for a long time with no guarantee of when they will finish, so the driver enables the much more expensive instruction-level preemption for them.[11]
- Triple buffering implemented at the driver level, which Nvidia calls "Fast Sync". The GPU maintains three frame buffers per monitor and renders continuously; each time a monitor needs a frame, it is sent the most recently completed one. This removes the initial delay caused by double buffering with vsync and prevents tearing. The costs are extra memory for the buffers and extra power, because two or more frames may be drawn between refreshes; in that case only the latest frame is displayed and the intermediate frames are wasted.[12] This feature was backported to Maxwell-based GPUs in driver version 372.70.[13] A minimal sketch of the buffer-selection logic appears after this list.
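The sketch below illustrates the Fast Sync buffer-selection idea described in the last item. It is a simplified model, not Nvidia's driver code; the class and method names are invented for the example.

```python
# Illustrative model of Fast Sync-style triple buffering (not Nvidia's code).
# The GPU keeps rendering without waiting for the display; at each refresh the
# newest completed frame is scanned out and any older pending frame is wasted.

class FastSyncPresenter:
    def __init__(self):
        self.front = None           # frame currently being scanned out
        self.last_completed = None  # newest fully rendered frame

    def on_frame_rendered(self, frame):
        # Called whenever the GPU finishes a frame. If an earlier frame was
        # still pending, it is silently dropped (the "wasted" frame).
        self.last_completed = frame

    def on_vblank(self):
        # Called once per display refresh. Swapping only at the blanking
        # interval avoids tearing, and never blocking the renderer avoids
        # the latency added by conventional vsync.
        if self.last_completed is not None:
            self.front = self.last_completed
        return self.front
```

Driving `on_frame_rendered` faster than `on_vblank` reproduces the behavior described above: intermediate frames are overwritten and never shown.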
Nvidia announced that the Pascal GP100 GPU features four High Bandwidth Memory stacks, allowing a total of 16 GB of HBM2 on the highest-end models,[14] along with 16 nm process technology,[5] Unified Memory and NVLink.[15]
Starting with Windows 10 version 2004, hardware-accelerated GPU scheduling is supported to reduce latency and improve performance; this requires a WDDM 2.7 driver.
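For readers who want to verify this setting on a given system, the sketch below reads the registry switch commonly used for hardware-accelerated GPU scheduling. The "HwSchMode" value name is an assumption based on common documentation, not something taken from this article's sources.

```python
# Sketch (Windows-only): check whether hardware-accelerated GPU scheduling is
# enabled. "HwSchMode" (2 = on, 1 = off) is the commonly documented switch;
# verify against current Microsoft documentation before relying on it.
import winreg

key_path = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
    try:
        value, _ = winreg.QueryValueEx(key, "HwSchMode")
        print("Hardware GPU scheduling enabled:", value == 2)
    except FileNotFoundError:
        print("HwSchMode not set (OS or driver does not expose it)")
```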
Products
Founders Edition
When announcing the GeForce 10 series, Nvidia introduced Founders Edition versions of the GTX 1060, 1070, 1070 Ti, 1080 and 1080 Ti. These are what were previously known as reference cards, i.e. cards designed and built by Nvidia itself rather than by its authorized board partners, and they serve as the reference against which partner cards are measured. The Founders Edition cards have a die-cast, machine-finished aluminum body with a single radial fan, vapor-chamber cooling (1070 Ti, 1080 and 1080 Ti only[16]), an upgraded power supply and a new low-profile backplate (1070, 1070 Ti, 1080 and 1080 Ti only).[17] Nvidia also released a limited supply of Founders Edition GTX 1060 cards that were only available directly from Nvidia's website.[18] Founders Edition prices (with the exception of the GTX 1070 Ti and 1080 Ti) are higher than the MSRP of partner cards; however, some partner cards with more complex designs, such as liquid or hybrid cooling, may cost more than the Founders Edition.
- An Inno3D GeForce GTX 1050 Twin X2
- A GeForce GTX 1080 Founders Edition in a computer
Reintroduction of older cards
Due to production problems with the RTX 30-series cards, a global semiconductor shortage caused by the COVID-19 pandemic, and rising demand for graphics cards driven by cryptocurrency mining, the GTX 1050 Ti, alongside the RTX 2060 and its Super counterpart,[19] was brought back into production in 2021.[20][21]
In addition, Nvidia quietly released the GeForce GT 1010 in January 2021.[22]
GeForce 10 (10xx) series for desktops
- Supported display standards are: DP 1.3/1.4, HDMI 2.0b, dual link DVI[b][23]
- Supported APIs are: Direct3D 12 (feature level 12_1), OpenGL 4.6, OpenCL 3.0 and Vulkan 1.3
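As a practical aside, the OpenCL level a card actually exposes depends on the installed driver; one way to check it is with the third-party pyopencl bindings. This is a sketch assuming an Nvidia OpenCL driver is installed, and is not taken from this article's sources.

```python
# Sketch: print each OpenCL device and the OpenCL version its driver reports.
# Requires the third-party "pyopencl" package and a working OpenCL driver.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices():
        # A GeForce 10 card with a recent driver reports an "OpenCL 3.0 ..." string.
        print(device.name, device.version)
```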
Model | Launch | Code name(s) | Fab (nm) | Transistors (billion) | Die size (mm2) | Bus interface |
Core config[c] |
SM count[d] |
L2 cache (KB) |
Clock speeds[e] | Fillrate[f][g] | Memory[e] | Processing power (GFLOPS)[h] | TDP (watts) |
SLI HB support[i] |
Launch MSRP (USD) | |||||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Base core clock (MHz) |
Boost core clock (MHz) |
Memory (MT/s) |
Pixel (GP/s) |
Texture (GT/s) |
Size (GB) |
Bandwidth (GB/s) |
Type | Bus width (bit) |
Single precision (boost) |
Double precision (boost) |
Half precision (boost)[24] |
Standard | Founders Edition | ||||||||||||
GeForce GT 1010 (DDR4)[j][25][26][27] |
Jun 7, 2022 | GP108-200-A1 | 14 | 1.8 | 74 | PCIe 3.0 ×4 |
256:16:16 | 2 | 256 | 1151 | 1379 | 2100 | 18.42 | 18.42 | 2 | 16.8 | DDR4 | 64 | 589.3 (706.1) |
24.56 (29.42) |
— | 20 | No | ? | — |
GeForce GT 1010[j][22][28] |
Jan 13, 2021 | GP108-200-A1 | 1228 | 1468 | 5000 | 23.49 | 23.49 | 40.1 | GDDR5 | 629 (752) |
26.2 (31.3) |
30 | $70[29] | ||||||||||||
GeForce GT 1030 (DDR4)[j][30][31] |
Mar 12, 2018 | GP108-310-A1 | 384:24:16 | 3 | 512 | 1151 | 1379 | 2100 | 18.41 | 27.6 | 16.8 | DDR4 | 883 (1059) |
27 (33) |
13 (16) |
20 | $80[32] | ||||||||
GeForce GT 1030[j][30][33] |
May 17, 2017 | GP108-300-A1 | 1227 | 1468 | 6000 | 19.6 | 29.4 | 48 | GDDR5 | 942 (1127) |
29 (35) |
15 (18) |
30 | ||||||||||||
GeForce GTX 1050 (2GB)[34][35] |
Oct 25, 2016 | GP107-300-A1 | 3.3 | 132 | PCIe 3.0 ×16 |
640:40:32 | 5 | 1024 | 1354 | 1455 | 7000 | 43.3 | 54.2 | 112 | 128 | 1733 (1862) |
54 (58) |
27 (29) |
75 | $109 | |||||
GeForce GTX 1050 (3GB)[36] |
May 21, 2018 | GP107-301-A1 | 768:48:24 | 6 | 768 | 1392 | 1518 | 33.4 | 66.8 | 3 | 84 | 96 | 2138 (2332) |
66 (72) |
33 (36) |
? | |||||||||
GeForce GTX 1050 Ti[34][37][38] |
Oct 25, 2016 | GP107-400-A1 | 768:48:32 | 1024 | 1290 | 1392 | 41.3 | 61.9 | 4 | 112 | 128 | 1981 (2138) |
62 (67) |
31 (33) |
$139 | ||||||||||
GeForce GTX 1060 (3GB)[39][40] |
Aug 18, 2016 | GP106-300-A1 | 16 | 4.4 | 200 | 1152:72:48 | 9 | 1536 | 1506 | 1708 | 8000 | 72.3 | 108.4 | 3 | 192 | 192 | 3470 (3935) |
108 (123) |
54 (61) |
120 | $199 | ||||
GeForce GTX 1060 (5GB)[41][42] |
Dec 26, 2017 (Only available in China) |
GP106-350-K3-A1 | 1280:80:40 | 10 | 1280 | 8000 | 60.2 | 120.5 | 5 | 160 | 160 | 3855 (4372) |
120 (137) |
60 (68) |
OEM | ||||||||||
GeForce GTX 1060[39][43][44] |
Jul 19, 2016 | GP106-400-A1 GP106-410-A1 |
1280:80:48 | 1536 | 8000 9000 |
72.3 | 6 | 192 216 |
192 | $249 | $299 | ||||||||||||||
GeForce GTX 1060 (GDDR5X)[45] |
Oct 18, 2018 | GP104-150-KA-A1 | 7.2 | 314 | 8000 | 192 | GDDR5X | — | — | ||||||||||||||||
GeForce GTX 1070[46][47] |
Jun 10, 2016 | GP104-200-A1 | 1920:120:64 | 15 | 2048 | 1683 | 96.4[k][48] | 180.7 | 8 | 256 | GDDR5 | 256 | 5783 (6463) |
181 (202) |
90 (101) |
150 | 2-way SLI HB[49] or 2/3/4-way SLI[50] |
$379 | $449 | ||||||
GeForce GTX 1070 Ti[51] |
Nov 2, 2017 | GP104-300-A1 | 2432:152:64 | 19 | 1607 | 102.8 | 244.3 | 7816 (8186) |
244 (256) |
122 (128) |
180 | $449 | |||||||||||||
GeForce GTX 1080[23][52][53] |
May 27, 2016 | GP104-400-A1 GP104-410-A1 |
2560:160:64 | 20 | 1733 | 10000 11000 |
257.1 | 320 352 |
GDDR5X | 8228 (8873) |
257 (277) |
128 (139) |
$599 | $699 | |||||||||||
GeForce GTX 1080 Ti[54] |
Mar 10, 2017 | GP102-350-K1-A1 | 12 | 471 | 3584:224:88 | 28 | 2816 | 1480 | 1582 | 11000 | 130.2 | 331.5 | 11 | 484 | 352 | 10609 (11340) |
332 (354) |
166 (177) |
250 | $699 | |||||
Nvidia Titan X[55][56] |
Aug 2, 2016 | GP102-400-A1 | 3584:224:96 | 3072 | 1417 | 1531 | 10000 | 136 | 317.4 | 12 | 480 | 384 | 10157 (10974) |
317 (343) |
159 (171) |
— | $1200 | ||||||||
Nvidia Titan Xp[57][58] |
Apr 6, 2017 | GP102-450-A1 | 3840:240:96 | 30 | 1405 | 1582 | 11410 | 135 | 337.2 | 547.7 | 10790 (12150) |
337 (380) |
169 (190) | ||||||||||||
Model | Launch | Code name(s) | Fab (nm) | Transistors (billion) | Die size (mm2) | Bus interface |
Core config[c] |
SM count[d] |
L2 cache (KB) |
Clock speeds[e] | Fillrate[f][g] | Memory | Processing power (GFLOPS)[h] | TDP (watts) |
SLI HB support[i] |
Launch MSRP (USD) | |||||||||
Base core clock (MHz) |
Boost core clock (MHz) |
Memory (MT/s) |
Pixel (GP/s) |
Texture (GT/s) |
Size (GB) |
Bandwidth (GB/s) |
Type | Bus width (bit) |
Single precision (boost) |
Double precision (boost) |
Half precision (boost)[59] |
Standard | Founders Edition |
- Pixel fillrate is calculated as the lowest of three numbers: number of ROPs multiplied by the base core clock speed, number of rasterizers multiplied by the number of fragments they can generate per rasterizer multiplied by the base core clock speed, and the number of streaming multiprocessors multiplied by the number of fragments per clock that they can output multiplied by the base clock rate.
- For calculating the processing power, see the Performance subsection of the Pascal architecture article.
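As a worked example of both notes, the sketch below reproduces the GTX 1080 Ti figures from the table above using its ROP, TMU and shader-core counts. Treating the ROP term as the limiting one in the pixel-fillrate minimum, and using the 1/32 (FP64) and 1/64 (FP16) per-clock ratios for GP102, are assumptions consistent with the tabulated values.

```python
# Worked example: GTX 1080 Ti values from the table above (illustrative only).
rops, tmus, shaders = 88, 224, 3584
base_clock, boost_clock = 1480e6, 1582e6  # Hz

pixel_fillrate   = rops * base_clock / 1e9    # ~130.2 GP/s (ROP-limited term)
texture_fillrate = tmus * base_clock / 1e9    # ~331.5 GT/s
fp32       = 2 * shaders * base_clock / 1e9   # ~10609 GFLOPS (base)
fp32_boost = 2 * shaders * boost_clock / 1e9  # ~11340 GFLOPS (boost)
fp64 = fp32 / 32                              # ~332 GFLOPS
fp16 = fp32 / 64                              # ~166 GFLOPS

print(f"{pixel_fillrate:.1f} GP/s, {texture_fillrate:.1f} GT/s, "
      f"{fp32:.0f}/{fp32_boost:.0f} GFLOPS FP32")
```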
GeForce 10 (10xx) series for notebooks
The biggest highlight of this line of notebook GPUs is that their specifications are close to (GTX 1060–1080) or even exceed (GTX 1050/1050 Ti) those of their desktop counterparts, rather than being "cut down" as in previous generations. As a result, the "M" suffix is dropped from the model names entirely, denoting that these notebook GPUs offer performance similar to their desktop equivalents, including the ability for the user to overclock their core frequencies, something not possible with previous generations of notebook GPUs. This was made possible by lower Thermal Design Power (TDP) ratings than their desktop equivalents, making these desktop-level GPUs thermally feasible in OEM notebook chassis with improved heat-dissipation designs; as such, they are only available through OEMs. In addition, the entire line of GTX notebook GPUs is also offered in lower-TDP, quieter variants called "Max-Q Design", launched on 27 June 2017. These are made specifically for ultra-thin gaming systems developed with OEM partners that incorporate enhanced heat dissipation and lower operating noise, and they are also available as an additional, more powerful option for existing gaming notebooks.
In addition, the GT line of notebook GPUs was discontinued starting with this generation and replaced by the MX series. Only the MX150 is based on Pascal, using the same GP108 die as the desktop GT 1030 with higher clock frequencies than its desktop counterpart; the other chips in the MX series are re-branded previous-generation GPUs (the MX130 is a re-branded GT 940MX, and the MX110 a re-branded GT 920MX).[citation needed]
- Supported APIs are: Direct3D 12 (feature level 12_1 or 11_0 on MX110 and MX130), OpenGL 4.6, OpenCL 3.0 and Vulkan 1.3
- Only GTX 1070 and GTX 1080 have SLI support.
Model | Launch | Code name(s) |
Fab (nm) | Transistors (billion) | Die size (mm2) | Bus interface |
Core config[a] |
SM Count[b] |
L2 cache (MB) |
Clock speeds | Fillrate[c][d] | Memory | Processing power (GFLOPS)[e] | TDP (watts) | ||||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Base core clock (MHz) |
Boost core clock (MHz) |
Memory (MT/s) |
Pixel (GP/s) |
Texture (GT/s) |
Size (GB) |
Bandwidth (GB/s) |
Type | Bus width (bit) |
Single precision (Boost) |
Double precision (Boost) |
Half precision (Boost) | |||||||||||
GeForce MX110[f][60][61] |
Nov 17, 2017 | GM108 (N16V-GMR1) |
28 | ? | ? | PCIe 3.0 ×4 |
384:24:8 | 3 | 1.0 | 965 | 993 | 1800 (DDR3) 5000 (GDDR5) |
7.944 | 23.83 | 2 | 14.4 (DDR3) 40.1 (GDDR5) |
DDR3 GDDR5 |
64 | 741.1 (762.6) |
23.16 (23.83) |
— | 30 |
GeForce MX130[f][62][63] |
GM108 (N16S-GTR) |
1122 | 1242 | 9.936 | 29.81 | 861.7 (953.9) |
26.93 (29.81) | |||||||||||||||
GeForce MX150[f][64][65][66] |
May 17, 2017 | GP108 (N17S-LG) |
14 | 1.8 | 74 | 384:24:16 | 0.5 | 937 | 1038 | 5000 | 14.99 | 22.49 | 2 4 |
40.1 | GDDR5 | 719.6 (797.2) |
22.49 (24.91) |
11.24 (12.45) |
10 | |||
GP108 (N17S-G1) |
1468 | 1532 | 6000 | 23.49 | 35.23 | 48 | 1127 (1177) |
35.23 (36.77) |
17.62 (18.38) |
25 | ||||||||||||
GeForce GTX 1050 Max-Q (Notebook)[67][68] |
Jan 3, 2018 | GP107 (N17P-G0) |
3.3 | 132 | PCIe 3.0 ×16 |
640:40:16 | 5 | 1.0 | 999–1189 | 1139–1328 | 7000 | 19.02 | 47.56 | 112 | 128 | 1278–1521 (1457–1699) |
39.96–47.56 (45.56–53.12) |
19.98–23.78 (22.78–26.56) |
34-40 | |||
GeForce GTX 1050 (Notebook)[67][68] |
Jan 3, 2017 | 1354 | 1493 | 21.66 | 54.16 | 1733 (1911) |
54.16 (59.72) |
27.08 (29.86) |
53 | |||||||||||||
GeForce GTX 1050 Ti Max-Q (Notebook)[67][69] |
Jan 3, 2018 | GP107 (N17P-G1) |
768:48:32 | 6 | 1151–1290 | 1290–1417 | 41.28 | 61.92 | 4 | 1767–1981 (1981–2176) |
55.24–61.92 (61.92–68.02) |
27.62–30.96 (30.96–34.01) |
40-46 | |||||||||
GeForce GTX 1050 Ti (Notebook)[67][69] |
Jan 3, 2017 | 1493 | 1620 | 47.78 | 71.66 | 2293 (2488) |
71.66 (77.76) |
35.83 (38.88) |
64 | |||||||||||||
GeForce GTX 1060 Max-Q (Notebook)[67][70] |
Jun 27, 2017 | GP106 (N17E-G1) |
16 | 4.4 | 200 | 1280:80:48 | 10 | 1.5 | 1063–1265 | 1341–1480 | 8000 | 60.72 | 101.2 | 3 6 |
192 | 192 | 2721–3238 (3432–3788) |
85.04–101.2 (107.3–118.4) |
42.52–50.60 (53.64–59.20) |
60-70 | ||
GeForce GTX 1060 (Notebook)[67][70] |
Aug 16, 2016 | 1404 | 1670 | 67.39 | 112.3 | 3594 (4275) |
112.3 (133.6) |
56.16 (66.80) |
80 | |||||||||||||
GeForce GTX 1070 Max-Q (Notebook)[67][71] |
Jun 27, 2017 | GP104 (N17E-G2) |
7.2 | 314 | 2048:128:64 | 16 | 2.0 | 1101–1215 | 1265–1379 | 77.76 | 155.5 | 8 | 256 | 256 | 4509–4977 (5181–5648) |
140.9–155.5 (161.9–176.5) |
70.46–77.76 (80.96–88.26) |
80-90 | ||||
GeForce GTX 1070 (Notebook)[67][71] |
Aug 16, 2016 | 1442 | 1645 | 92.29 | 184.6 | 5906 (6738) |
184.6 (210.6) |
92.29 (105.3) |
115 | |||||||||||||
GeForce GTX 1080 Max-Q (Notebook)[67][72] |
Jun 27, 2017 | GP104 (N17E-G3) |
2560:160:64 | 20 | 1101–1290 | 1278–1458 | 10000 | 82.56 | 206.4 | 320 | GDDR5X | 5637–6605 (6543–7465) |
176.2–206.4 (204.5–233.3) |
88.08–103.2 (102.2–116.6) |
90-110 | |||||||
GeForce GTX 1080 (Notebook)[67][72] |
Aug 16, 2016 | 1556 | 1733 | 99.58 | 249.0 | 7967 (8873) |
249.0 (277.3) |
124.5 (138.6) |
150 |
- Pixel fillrate is calculated as the lowest of three numbers: number of ROPs multiplied by the base core clock speed, number of rasterizers multiplied by the number of fragments they can generate per rasterizer multiplied by the base core clock speed, and the number of streaming multiprocessors multiplied by the number of fragments per clock that they can output multiplied by the base clock rate.
- For calculating the processing power, see the Performance subsection of the Pascal architecture article.
Discontinued support
Nvidia stopped releasing 32-bit drivers for 32-bit operating systems after driver 391.35 in March 2018.[73]
Nvidia announced that after the release of the 470 series drivers, it would transition driver support for Windows 7 and Windows 8.1 to legacy status and continue to provide critical security updates for these operating systems through September 2024.[74] The GeForce 10 series is the last Nvidia GPU generation to support Windows 7/8.x or any 32-bit operating system; beginning with the Turing architecture, Nvidia GPUs require a 64-bit operating system.