Data Visualization with Python and JavaScript: Scrape, Clean, Explore, and Transform Your Data, 2/e

Dale, Kyran

Description

How do you turn raw, unprocessed, or malformed data into dynamic, interactive web visualizations? In this practical book, author Kyran Dale shows data scientists and analysts--as well as Python and JavaScript developers--how to create the ideal toolchain for the job. By providing engaging examples and stressing hard-earned best practices, this guide teaches you how to leverage the power of best-of-breed Python and JavaScript libraries.

Python provides accessible, powerful, and mature libraries for scraping, cleaning, and processing data. And while JavaScript is the best language when it comes to programming web visualizations, its data processing abilities can't compare with Python's. Together, these two languages are a perfect complement for creating a modern web-visualization toolchain. This book gets you started.

You'll learn how to:

- Obtain the data you need programmatically, using scraping tools or web APIs: Requests, Scrapy, Beautiful Soup (see the scraping sketch after this list)
- Clean and process data using Python's heavyweight data-processing libraries within the NumPy ecosystem: Jupyter notebooks with pandas, Matplotlib, and Seaborn (see the pandas sketch after this list)
- Deliver the data to a browser with static files or by using Flask, the lightweight Python server, and a RESTful API (see the Flask sketch after this list)
- Pick up enough web development skills (HTML, CSS, JS) to get your visualized data on the web
- Use the data you've mined and refined to create web charts and visualizations with Plotly, D3, Leaflet, and other libraries (see the Plotly sketch after this list)
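
The first bullet names Requests, Scrapy, and Beautiful Soup. As a rough sketch of that kind of scraping workflow (not an example from the book; the URL and the `td.name` selector are placeholders), fetching a page with Requests and extracting matching cells with Beautiful Soup might look like this:

```python
# Minimal scraping sketch using Requests and Beautiful Soup.
# The URL and the CSS selector are placeholders, not taken from the book.
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/laureates")
resp.raise_for_status()

soup = BeautifulSoup(resp.text, "html.parser")
# Collect the text of every table cell matched by the (hypothetical) selector.
names = [cell.get_text(strip=True) for cell in soup.select("td.name")]
print(names[:10])
```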
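
For the cleaning step, a minimal pandas sketch might look like the following; the file name and column names are assumptions for illustration rather than the book's own dataset:

```python
# Minimal cleaning sketch with pandas; the file name and column names
# are hypothetical, not taken from the book's dataset.
import pandas as pd

df = pd.read_json("scraped_data.json")

# Typical tidying steps: drop exact duplicates, strip stray whitespace,
# coerce a date column, and discard rows missing required fields.
df = df.drop_duplicates()
df["name"] = df["name"].str.strip()
df["date_of_birth"] = pd.to_datetime(df["date_of_birth"], errors="coerce")
df = df.dropna(subset=["name", "date_of_birth"])

df.to_json("cleaned_data.json", orient="records")
```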
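
Delivering the data through Flask and a RESTful API can be sketched in a few lines; the `/api/data` route and the JSON file name are illustrative assumptions, not the book's API:

```python
# Minimal Flask sketch serving cleaned data as JSON to the browser.
# The route and file name are illustrative assumptions.
import json
from flask import Flask, jsonify

app = Flask(__name__)

with open("cleaned_data.json") as f:
    DATA = json.load(f)

@app.route("/api/data")
def get_data():
    # A real API would add pagination, filtering, and error handling.
    return jsonify(DATA)

if __name__ == "__main__":
    app.run(debug=True)
```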
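
On the charting side, the book works with Plotly, D3, and Leaflet; a small Python-side Plotly Express sketch (column names are hypothetical) that writes an interactive HTML chart might be:

```python
# Minimal Plotly Express sketch: turn a cleaned dataframe into an
# interactive HTML chart. Column names are hypothetical.
import pandas as pd
import plotly.express as px

df = pd.read_json("cleaned_data.json")
fig = px.bar(df, x="category", y="count", title="Example chart")
fig.write_html("chart.html")  # open in a browser for the interactive view
```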
