ML-NIC: accelerating machine learning inference using smart network interface cards
Low-latency inference is increasingly a requirement for machine learning models, as they are used in mission-critical applications such as autonomous driving, military defense (e.g., target recognition), and network traffic analysis. A widely studied and used technique to...
Main Authors: Raghav Kapoor, David C. Anastasiu, Sean Choi
Format: Article
Language: English
Published: Frontiers Media S.A., 2025-01-01
Series: Frontiers in Computer Science
Online Access: https://www.frontiersin.org/articles/10.3389/fcomp.2024.1493399/full
Similar Items
- Psychiatric Nursing Care Process with NANDA, NIC, and NOC Classifications: Case Example
  by: Tuğba Şahin Tokatlıoğlu, et al.
  Published: (2024-12-01)
- Possibilities of using of hardware accelerators for intrusion detection and prevention systems
  by: Artem Tetskyi, et al.
  Published: (2024-11-01)
- An efficient loop tiling framework for convolutional neural network inference accelerators
  by: Hongmin Huang, et al.
  Published: (2022-01-01)
- Privacy-preserving attribute ticket scheme based on mobile terminal with smart card
  by: Rui SHI, et al.
  Published: (2022-10-01)
- The language of insults: A look at Theme, Rheme and negative inferences
  by: Leong Alvin Ping
  Published: (2022-10-01)