Theta Network and AWS Transform Academic AI with Yonsei University’s Adoption of Trainium-Powered EdgeCloud

Theta Network and AWS recently joined forces to bring Amazon's custom AI chips, Trainium and Inferentia, to the EdgeCloud platform. We're proud to share that one of Korea's most prestigious institutions, Yonsei University's Data & Language Intelligence Lab, led by Professor Dongha Lee, will be utilizing the Theta-AWS Trainium infrastructure for a major AI agent project.
This marks a significant milestone as Theta Network welcomes Yonsei University as a marquee institution-level user of Trainium-powered instances, reflecting our commitment to offering diverse GPU and next-gen AI chip resources tailored to customer needs. This follows AWS' approval of Theta EdgeCloud Hybrid as the first decentralized AI platform to integrate its cutting-edge AI silicon.
Theta is the first blockchain network to deploy Amazon’s next-generation chipsets and deliver unmatched performance for AI, video, and media workloads.
Yonsei University and Theta EdgeCloud Background
The Data & Language Intelligence Lab from Yonsei University joined as one of the earliest adopters of Theta EdgeCloud nearly a year ago, shortly after our platform’s launch, and has continued to leverage our decentralized, high-performance infrastructure for cutting-edge AI research.
“Theta EdgeCloud has been an integral part of our research infrastructure over the past year. It’s been fantastic to collaborate closely with Theta and AWS in the past few months, and with the addition of AWS Trainium, we can now scale our experiments faster, more efficiently, and with greater reproducibility. This enables us to push the boundaries of conversational AI and recommendation systems in ways that were previously not practical,” said Professor Dongha Lee.
Over the past year, under Professor Lee’s leadership, the lab has achieved remarkable recognition in the global AI community, further strengthened by their extensive use of Theta EdgeCloud Hybrid to power and scale their research. Their accomplishments include:
- Certificate of Excellence in Reviewing from the prestigious KDD conference.
- Outstanding Paper Award at ACL 2024 for “Can Large Language Model be Good Emotional Supporter? Mitigating Preference Bias on Emotional Support Conversation”.
- Over a dozen papers accepted at prestigious conferences including EMNLP 2024, NeurIPS 2024, NAACL 2025, ICLR 2025, SIGIR 2025, ACL 2025, COLM 2025, and more.
These milestones reflect the lab’s deep expertise in natural language processing, large language models, and human-centered AI, and we’re honored that they will now pioneer their next phase of research on AWS Trainium-powered Theta EdgeCloud Hybrid.
Yonsei’s Research with AWS Trainium and EdgeCloud Hybrid
Using AWS Trainium Trn1 instances, purpose-built for high-performance deep learning training at up to 50% lower cost-to-train than comparable GPU-based instances, Yonsei will pioneer a scalable, reproducible framework for training and evaluating conversational recommendation (CR) agents.
Instead of relying on costly and inconsistent human evaluations, the lab will employ high-fidelity AI-simulated users with memory, personality traits, and evolving preferences, alongside automated multi-dimensional evaluation models that measure coherence, informativeness, and persona alignment in real time.
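As a rough illustration of that setup (not the lab's actual implementation), a simulated user can be modeled as a small stateful object whose persona and preference memory condition each reply, while an automated judge scores every turn along the dimensions mentioned above. All class, field, and function names in this Python sketch are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class SimulatedUser:
    """Hypothetical AI-simulated user: a persona plus evolving preference memory."""
    persona: str                                      # e.g. "thrifty sci-fi fan who dislikes long movies"
    memory: list = field(default_factory=list)        # past turns the user "remembers"
    preferences: dict = field(default_factory=dict)   # item -> affinity score, updated over time

    def react_to(self, recommendation: str) -> str:
        # In a real system an LLM prompted with the persona and memory would generate
        # this reply; a placeholder string keeps the sketch self-contained.
        self.memory.append(recommendation)
        return f"As a {self.persona}, here is what I think of {recommendation}..."

@dataclass
class TurnEvaluation:
    """Automated multi-dimensional scores for a single dialogue turn."""
    coherence: float
    informativeness: float
    persona_alignment: float

def evaluate_turn(user: SimulatedUser, agent_reply: str) -> TurnEvaluation:
    # A learned evaluation model would produce these scores in real time;
    # fixed constants stand in for model outputs in this sketch.
    return TurnEvaluation(coherence=0.9, informativeness=0.8, persona_alignment=0.85)
```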
The system will refine chatbot models through Direct Preference Optimization (DPO), using pairwise preference signals from simulated conversations without the need for manual labeling.
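For reference, the core of DPO is a pairwise objective over a preferred ("chosen") and a less preferred ("rejected") response, computed from the log-probabilities the policy and a frozen reference model assign to each. The minimal PyTorch sketch below shows that standard objective; batch shapes and the beta value are chosen for illustration only:

```python
import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps, policy_rejected_logps,
             ref_chosen_logps, ref_rejected_logps, beta=0.1):
    """Direct Preference Optimization loss for a batch of preference pairs.

    Each argument is a 1-D tensor of summed log-probabilities that the policy
    or the frozen reference model assigns to the chosen / rejected responses.
    """
    chosen_rewards = beta * (policy_chosen_logps - ref_chosen_logps)
    rejected_rewards = beta * (policy_rejected_logps - ref_rejected_logps)
    # Maximize the margin between chosen and rejected responses.
    return -F.logsigmoid(chosen_rewards - rejected_rewards).mean()

# Example with dummy log-probabilities for a batch of 4 preference pairs:
loss = dpo_loss(torch.randn(4), torch.randn(4), torch.randn(4), torch.randn(4))
```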
Running entirely on AWS Trainium via the Neuron Kernel Interface (NKI) deployed on Theta EdgeCloud Hybrid, this framework can simulate millions of realistic user interactions per day, evaluate and improve models instantly within a hardware-optimized loop, and ensure deterministic, reproducible training at scale.
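One way to picture that hardware-optimized loop is a repeated simulate-evaluate-update cycle with fixed seeds for reproducibility. The details of the lab's pipeline and its Neuron/NKI integration are not public, so the sketch below is a framework-agnostic outline in which `agent`, `evaluator`, and their methods are hypothetical interfaces (for example, `agent.dpo_loss` could wrap the loss shown above):

```python
import random
import torch

def training_iteration(agent, simulated_users, evaluator, optimizer, seed: int):
    """One pass of the simulate -> evaluate -> preference-update loop (illustrative)."""
    torch.manual_seed(seed)   # fixed seeds help keep runs deterministic and reproducible
    random.seed(seed)

    pairs = []
    for user in simulated_users:
        # Sample two candidate agent responses for the same simulated user turn.
        reply_a, reply_b = agent.sample(user), agent.sample(user)
        score_a, score_b = evaluator(user, reply_a), evaluator(user, reply_b)
        # The higher-scored reply becomes the "chosen" side of the preference pair.
        chosen, rejected = (reply_a, reply_b) if score_a >= score_b else (reply_b, reply_a)
        pairs.append((chosen, rejected))

    loss = agent.dpo_loss(pairs)   # e.g. the DPO objective sketched earlier
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()               # on Trainium, compute is lowered through the Neuron toolchain
    return loss
```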
This vastly accelerates AI R&D cycles in academia. Beyond recommendation systems, it can be extended to support customer service bots, task planners, tutoring systems, and other goal-oriented conversational AI agents, amplifying its potential across industries.
Why It Matters
This collaboration is significant because it combines three industry firsts:
- First decentralized platform approved by AWS to integrate its custom AI silicon Trainium and Inferentia.
- First blockchain network to deploy Amazon’s next-generation AI chipsets for real-world workloads.
- First institution-level customer to adopt Trainium-powered Theta EdgeCloud Hybrid for advanced AI research.
By bringing AWS Trainium’s high-performance, cost-efficient capabilities together with Theta EdgeCloud’s globally distributed network of over 30,000 NVIDIA GPUs, researchers now gain unprecedented flexibility to choose the right compute for the right workload – whether training billion-parameter language models, running large-scale inference, or powering next-generation generative AI applications.
“Yonsei University’s adoption of AWS Trainium on EdgeCloud Hybrid is a perfect example of how decentralized blockchain infrastructure and cutting-edge AI hardware can work hand-in-hand to accelerate world-class research. Professor Lee and his team have consistently produced groundbreaking work, and we’re proud to provide the compute power that will help them achieve even greater milestones. This collaboration sets a benchmark for how academic institutions can leverage decentralized cloud technology to lead the next wave of AI innovation,” said Mitch Liu, Co-founder and CEO of Theta Labs.
This is not just about faster AI training; it’s about changing the economics and accessibility of AI research. Institutions like Yonsei can now train and optimize complex models at a fraction of the cost, with reproducibility and scale that traditional cloud or on-premise solutions cannot match.