<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Developer Tools on Nitin Kumar Singh</title><link>https://nitinksingh.com/categories/developer-tools/</link><description>Recent content in Developer Tools on Nitin Kumar Singh</description><generator>Hugo -- gohugo.io</generator><language>en</language><copyright>© 2026 Nitin Kumar Singh. All rights reserved.</copyright><lastBuildDate>Sat, 21 Mar 2026 10:00:00 +0530</lastBuildDate><atom:link href="https://nitinksingh.com/categories/developer-tools/index.xml" rel="self" type="application/rss+xml"/><item><title>Building Dictum — a macOS Dictation App with Tauri 2 (Rust + React) &amp; Azure OpenAI</title><link>https://nitinksingh.com/posts/building-dictum-a-macos-dictation-app-with-tauri-2-rust--react-azure-openai/</link><pubDate>Sat, 21 Mar 2026 10:00:00 +0530</pubDate><guid>https://nitinksingh.com/posts/building-dictum-a-macos-dictation-app-with-tauri-2-rust--react-azure-openai/</guid><description>&lt;p&gt;macOS has had built-in dictation since Monterey. It is fine — press and hold a key, speak, done. But it requires Apple&amp;rsquo;s servers (unless you download the enhanced on-device model), only works in some apps, and you have zero control over punctuation, formatting, or hotkeys.&lt;/p&gt;</description></item><item><title>Streamlining AI Development with LiteLLM Proxy: A Comprehensive Guide</title><link>https://nitinksingh.com/posts/streamlining-ai-development-with-litellm-proxy-a-comprehensive-guide/</link><pubDate>Sun, 25 May 2025 09:00:00 +0530</pubDate><guid>https://nitinksingh.com/posts/streamlining-ai-development-with-litellm-proxy-a-comprehensive-guide/</guid><description>&lt;p&gt;Development teams integrating multiple AI models face real friction: each provider brings its own API, authentication scheme, and pricing model, and that sprawl slows delivery and adds technical debt. This article walks through a Docker-based setup combining the LiteLLM proxy with Open WebUI to put a single, consistent interface in front of every model your team uses.&lt;/p&gt;</description></item></channel></rss>