LLM Evaluation, Parallel Computing, Demand Forecasting, and Other Hands-On Data Science Approaches

Feeling inspired to write your first TDS post? We’re always open to contributions from new authors.

As we all settle into the sometimes hectic rhythm of a new year, we hope you’ve been enjoying the excitement of kicking off projects, learning about new topics, and exploring your next career moves. We’re definitely seeing a flurry of activity among our authors—both longstanding contributors and recent additions—and are thrilled to share all the great work they’ve been cooking up over the holidays.

Our lineup of top-notch reads this week has a distinctly actionable, hands-on flavor to it—after all, what better way to harness all this energy than by tinkering with some datasets, models, and code? Whether you’re interested in learning more about cutting-edge evaluation methods or building agentic-AI tools, we’ve got you covered with a diverse selection of tutorials and practical overviews. Ready to dive in?

Paradigm Shifts of Eval in the Age of LLMs
Is it time to reevaluate the way we approach evaluations? Lili Jiang believes it is: “I’ve come to recognize that LLMs require some subtle, conceptually simple, yet important changes in the way we think about evaluation.” Her latest article offers high-level insights into what a new paradigm might look like.

The Next Frontier in LLM Accuracy
Staying thematically close to LLM optimization, Mariya Mansurova’s new deep dive unpacks in great detail several methods we can use to increase models’ accuracy, and zooms in on advanced fine-tuning techniques.

Photo by Vishal Banik on Unsplash

How to Build a Graph RAG App
Ready to roll up your sleeves and dig deep into some code? Steve Hedden’s thorough tutorial on creating your first graph RAG app is a great option for anyone who’s interested in this trending topic but needs guidance and context to ensure they’re starting off on the right foot.

Multi-Agentic RAG with Hugging Face Code Agents
Agent-based systems gained enormous steam (and buzz) last year, and it doesn’t seem like that’s about to change in 2025. Curious to learn more about them? Gabriele Sgroi, PhD’s patient, step-by-step guide may be long, but it remains accessible and clear as it outlines the process of leveraging a “small” LLM to power a multi-agentic system—and produce good results, even on consumer-grade hardware.

Demand Forecasting with Darts: A Tutorial
LLMs may be grabbing much of our collective attention these days, but business-focused workflows remain the bread and butter of industry data scientists. Sandra E.G.’s debut TDS article provides a robust, hands-on introduction to one such essential task: demand forecasting in the context of retail sales.

Distributed Parallel Computing Made Easy with Ray
It’s crucial for data and ML practitioners to experiment with new tools and frameworks, as seemingly small improvements can accumulate into major cost and efficiency benefits. Betty LD walks us through her recent foray into the AI-focused Ray library for distributed data processing, and demonstrates its power through the use case of scalable offline batch inference.
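To give a flavor of the workflow Betty LD’s article covers, here is a minimal stdlib-only sketch of the map-over-batches pattern behind offline batch inference — chunk a dataset, run a model over each chunk in parallel workers, and gather the results. Ray’s actual API is different (and scales this across a cluster rather than local threads), and `fake_model` and the batch sizes here are illustrative stand-ins, not code from the article.

```python
from concurrent.futures import ThreadPoolExecutor

def fake_model(batch):
    # Stand-in for a real model's forward pass: score each item in the batch.
    return [len(text) for text in batch]

def make_batches(items, batch_size):
    # Chunk the dataset so each worker receives one unit of work.
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

def batch_inference(items, batch_size=4, max_workers=2):
    batches = make_batches(items, batch_size)
    # Each batch is scored by a worker; map preserves input order,
    # which is the same map-over-batches idea Ray distributes at scale.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = pool.map(fake_model, batches)
    # Flatten per-batch outputs back into a single list of predictions.
    return [pred for batch_preds in results for pred in batch_preds]

if __name__ == "__main__":
    docs = ["a", "bb", "ccc", "dddd", "eeeee", "ffffff"]
    print(batch_inference(docs, batch_size=2))
```

The key design point, which carries over to Ray, is that the unit of parallelism is the batch rather than the individual record, so per-task overhead stays small relative to the model work.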

If you’re ready to branch out into other topics this week, we’re here to help—whether your interests lie at the intersection of music and AI, quantum computing, or linear algebra (among others), we hope you explore some of these excellent articles:

Presenting her team’s cutting-edge research, Tula Masterman explains how we can leverage the hidden state from an intermediate Transformer layer for efficient and robust content safety and prompt injection classification.

Music-centered AI tools continue to make strides; Max Hilsdorf devotes his latest exploration of the topic to mono-to-stereo upmixing, a technique for enriching or improving our music-listening experience.

With seven first-author publications under her belt, Malak Sadek is well-positioned to offer concrete insights for other researchers who’d like to grow their publishing footprint.

Continuing his ongoing series on key linear algebra concepts, Rohit Pandey recently shared a new, comprehensive explainer that looks at the inner workings of orthonormal matrices.

Quantum computing has been a hot topic for some time now, though discussions of its promise can sometimes feel almost sci-fi-adjacent. Here to help us make sense of the field and where things stand at the moment is Sara A. Metwalli, whose primer comes right as the UN has declared 2025 the International Year of Quantum Science and Technology.

While we’re on the topic, let’s round out this week’s selection with Benjamin Assel’s nuanced discussion of error-rate measurement in IBM quantum processors, which also includes code examples using Qiskit.
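Benjamin Assel’s article works with real IBM hardware and Qiskit; as a toy, self-contained illustration of the statistics underneath error-rate measurement (entirely our construction, not code from his article), one can model a noisy gate as a Bernoulli process and estimate its error rate from repeated shots:

```python
import random

def noisy_gate(error_rate, rng):
    # Model one gate execution: True means the gate produced the wrong result.
    return rng.random() < error_rate

def estimate_error_rate(true_rate, shots, seed=0):
    # Run many "shots" and report the observed failure fraction --
    # the same count-and-divide estimate used when benchmarking real devices.
    rng = random.Random(seed)
    failures = sum(noisy_gate(true_rate, rng) for _ in range(shots))
    return failures / shots

if __name__ == "__main__":
    est = estimate_error_rate(true_rate=0.01, shots=100_000, seed=42)
    print(f"estimated error rate: {est:.4f}")
```

On real processors the failures come from the device rather than a random-number generator, but the statistical question is the same: how many shots do you need before the observed rate pins down the true one?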

Thank you for supporting the work of our authors! As we mentioned above, we love publishing articles from new authors, so if you’ve recently written an interesting project walkthrough, tutorial, or theoretical reflection on any of our core topics, don’t hesitate to share it with us.

Until the next Variable,

TDS Team

LLM Evaluation, Parallel Computing, Demand Forecasting, and Other Hands-On Data Science Approaches was originally published in Towards Data Science on Medium.
