Product Description
Everything you need to know about using the tools, libraries, and models at Hugging Face--from transformers to RAG, LangChain, and Gradio.

Hugging Face in Action reveals how to get the absolute best out of everything Hugging Face, from accessing state-of-the-art models to building intuitive frontends for AI apps. With Hugging Face in Action you'll learn:

- Utilizing Hugging Face Transformers and Pipelines for NLP tasks (see the short sketch below)
- Applying Hugging Face techniques for Computer Vision projects
- Manipulating Hugging Face Datasets for efficient data handling
- Training Machine Learning models with AutoTrain functionality
- Implementing AI agents for autonomous task execution
- Developing LLM-based applications using LangChain and LlamaIndex
- Constructing LangChain applications visually with LangFlow
- Creating web-based user interfaces using Gradio
- Building locally running LLM-based applications with GPT4All
- Querying local data using Large Language Models

Want a cutting-edge transformer library? Hugging Face's open source offering is best in class. Need somewhere to host your models? Hugging Face Spaces has you covered. Do your users need an intuitive frontend for your AI app? Hugging Face's Gradio library makes it easy to build a UI using the Python skills you already have. In Hugging Face in Action you'll learn how to take full advantage of Hugging Face's features to quickly and reliably prototype and productionize AI applications.

About the technology
Hugging Face is an incredible open-source ecosystem for AI engineers and data scientists, providing hundreds of pre-trained models, datasets, tools, and libraries. It's also a central hub for collaborating on leading-edge AI research. Hugging Face is a massive platform, and this book will help you take full advantage of all it has to offer.

About the book
Hugging Face in Action teaches you how to build end-to-end AI systems using resources from the Hugging Face community. In it, you'll create multiple projects, including an object detection model, a RAG Q&A application, an LLM-powered chatbot, and more. You'll appreciate the clear, accessible explanations, along with thoughtful introductions to key technologies like LangChain, LlamaIndex, and Gradio.
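As a taste of the Transformers and pipelines topic in the list above, here is a minimal sketch of a Hugging Face sentiment-analysis pipeline. It is illustrative only, not an excerpt from the book; the input sentence and printed output are assumptions, and the model is whichever default the library downloads.

# Minimal sentiment-analysis sketch using the Hugging Face pipeline API (illustrative).
from transformers import pipeline

# Downloads a default sentiment model on first run.
classifier = pipeline("sentiment-analysis")

result = classifier("Hugging Face in Action made pipelines easy to pick up.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99}]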
What's inside
- How to navigate the huge Hugging Face library of models and tools
- How to run LLMs locally using GPT4All
- How to create web-based user interfaces using Gradio (see the short sketch at the end of this description)
- How to improve models using Hugging Face datasets

About the reader
For Python programmers familiar with NumPy and Pandas. No AI experience required.

About the author
Wei-Meng Lee is a technologist and founder of Developer Learning Solutions.

Table of Contents
1 Introducing Hugging Face
2 Getting started
3 Using Hugging Face transformers and pipelines for NLP tasks
4 Using Hugging Face for computer vision tasks
5 Exploring, tokenizing, and visualizing Hugging Face datasets
6 Fine-tuning pretrained models and working with multimodal models
7 Creating LLM-based applications using LangChain and LlamaIndex
8 Building LangChain applications visually using Langflow
9 Programming agents
10 Building a web-based UI using Gradio
11 Building locally running LLM-based applications using GPT4All
12 Using LLMs to query your local data
13 Bridging LLMs to the real world with the Model Context Protocol

Get a free eBook (PDF or ePub) from Manning as well as access to the online liveBook format (and its AI assistant that will answer your questions in any language) when you purchase the print book.
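Chapter 10 and the "What's inside" list mention building web UIs with Gradio. The following is a minimal sketch of such an interface, assuming only the public gradio package; the echo function, title, and port note are illustrative assumptions, not material from the book.

# Minimal Gradio UI sketch (illustrative; swap echo for a real model call).
import gradio as gr

def echo(message: str) -> str:
    # Placeholder for a model call, e.g. a Transformers pipeline.
    return f"You said: {message}"

demo = gr.Interface(fn=echo, inputs="text", outputs="text", title="Demo UI")

if __name__ == "__main__":
    demo.launch()  # serves a local web app, typically at http://127.0.0.1:7860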
About the Author
Wei-Meng Lee is a technologist and founder of Developer Learning Solutions, a company specializing in helping businesses adopt the latest IT technologies. He provides consultancy services to companies adopting blockchain and AI solutions.