
Talon

Skylake Nodes

These are the newest nodes in the Talon cluster. All of them offer higher core counts than previous generations, and select nodes add extra memory or GPU accelerators for more demanding workloads.

Each entry below gives the quantity, role (with its queue), server model, and per-node specs.

15 × Compute Node (talon queue): HP ProLiant DL360 Gen10
  • Two 64-bit 18-core Intel Xeon Gold 6140 processors (36 cores per node)
  • 192GB of RAM per node
  • Two 960GB SSDs in a RAID 1 (mirrored) configuration

3 × Large Memory Node (talon-large queue): HP ProLiant DL560 Gen10
  • Four 64-bit 18-core Intel Xeon Gold 6140 processors (72 cores per node)
  • 3TB of RAM per node
  • Two 960GB SSDs in a RAID 1 (mirrored) configuration

2 × GPU Accelerated Deep Learning Node (talon-gpu32 queue): HP ProLiant XL270d Gen10
  • Two 64-bit 18-core Intel Xeon Gold 6140 processors (36 cores per node)
  • 1.5TB of RAM per node
  • 8 NVIDIA Tesla V100 GPUs per node, each with 32GB of HBM2 VRAM (5120 CUDA cores and 640 Tensor cores per card)

3 × Web Services Node (VM nodes): HP ProLiant DL360 Gen10
  • Two 64-bit 18-core Intel Xeon Gold 6140 processors (36 cores per node)
  • 768GB of RAM per node
  • Two 960GB SSDs in a RAID 1 (mirrored) configuration

2 × Head Node: HP ProLiant DL360 Gen10
  • Two 64-bit 10-core Intel Xeon Silver 4114 processors (20 cores per node)
  • 96GB of RAM per node
  • Two 4TB HDDs
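Each node class above is reached through the queue shown next to its role. As a minimal sketch, assuming Talon's batch scheduler is Slurm (the scheduler is not named on this page), the Python snippet below submits a job requesting one V100 on the talon-gpu32 queue; the core, memory, and walltime values are illustrative placeholders rather than site policy.

    # Minimal sketch: submit a GPU job to the talon-gpu32 queue.
    # ASSUMPTION: Talon runs Slurm; all resource values are placeholders.
    import subprocess

    job_lines = [
        "#!/bin/bash",
        "#SBATCH --partition=talon-gpu32",  # GPU queue from the table above
        "#SBATCH --gres=gpu:1",             # one of the eight V100s on a node
        "#SBATCH --cpus-per-task=4",        # placeholder core count
        "#SBATCH --mem=32G",                # placeholder memory request
        "#SBATCH --time=01:00:00",          # placeholder walltime
        "#SBATCH --job-name=v100-check",
        "",
        "nvidia-smi",                       # confirm the allocated GPU is visible
    ]
    job_script = "\n".join(job_lines) + "\n"

    # sbatch reads the script from stdin and prints "Submitted batch job <id>".
    result = subprocess.run(
        ["sbatch"], input=job_script, capture_output=True, text=True, check=True
    )
    print(result.stdout.strip())

The talon and talon-large queues would follow the same pattern, swapping the partition name and dropping the GPU request.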

Additional Specs

  • Private 1Gbps administration network
  • Public 10Gbps Ethernet network
  • Private 100Gbps EDR InfiniBand research network for the Skylake nodes
  • 288TB Ceph storage system
  • DDN GS7990 GPFS storage system, 5PB (5000TB raw, 4300TB usable)
  • Red Hat Enterprise Linux 8.8
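As another hedged illustration, the snippet below reports the capacity of the shared GPFS file system using only the Python standard library; the mount point /gpfs/scratch is a hypothetical placeholder, since the actual mount path is not listed on this page.

    # Minimal sketch: report capacity of the shared GPFS file system.
    # ASSUMPTION: the mount point below is hypothetical; the real path on
    # Talon is not given on this page.
    import shutil

    GPFS_MOUNT = "/gpfs/scratch"  # hypothetical mount point

    usage = shutil.disk_usage(GPFS_MOUNT)
    tib = 1024 ** 4
    print(f"{GPFS_MOUNT}: {usage.total / tib:.1f} TiB total, "
          f"{usage.used / tib:.1f} TiB used, "
          f"{usage.free / tib:.1f} TiB free")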
Computational Research Center
Chester Fritz Library Room 334
3051 University Ave Stop 8399
Grand Forks, ND 58202-8399
Phone: 701.777.6514
und.hpc.support@UND.edu

Hours

Mon.-Fri.: 9 a.m. - 5 p.m.
