Edition:
Authors: Sergio Mendez
Series:
ISBN: 1800568592, 9781800568594
Publisher: Packt Publishing
Publication year: 2022
Number of pages: 458
Language: English
File format: EPUB (converted to PDF, EPUB, or AZW3 upon request)
File size: 8 MB
If you would like the book Edge Computing Systems with Kubernetes: A use case guide for building edge systems using K3s, k3OS, and open source cloud native technologies converted to PDF, EPUB, AZW3, MOBI, or DJVU format, you can notify support and they will convert the file for you.
Please note that this book is the original English-language edition and is not a Persian translation. The International Library website only provides books in their original language and does not offer any books translated into or written in Persian.
Understand how to use K3s and k3OS for different use cases and discover best practices for building an edge computing system
Edge computing is a way of processing information near the source of the data instead of processing it in data centers in the cloud. By doing so, edge computing reduces the latency of data processing and improves the user experience of real-time data visualization in your applications. Using K3s, a lightweight Kubernetes distribution, and k3OS, a K3s-based Linux distribution, along with other open source cloud native technologies, you can build reliable edge computing systems without spending a lot of money.
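As a flavor of what working with such a cluster looks like from the application side, here is a minimal sketch (an illustration, not code from the book) that lists the nodes of a K3s cluster with the official kubernetes Python client; it assumes you have already copied the cluster's kubeconfig to your workstation:

    # Minimal sketch: list the nodes of a K3s cluster from Python.
    # Assumes the K3s kubeconfig (by default /etc/rancher/k3s/k3s.yaml on the
    # server) has been copied to ~/.kube/config or referenced via KUBECONFIG.
    from kubernetes import client, config

    config.load_kube_config()      # read the kubeconfig from the default location
    v1 = client.CoreV1Api()

    for node in v1.list_node().items:
        labels = node.metadata.labels or {}
        # The well-known kubernetes.io/arch label shows arm64 on Raspberry Pi nodes.
        print(node.metadata.name, labels.get("kubernetes.io/arch", "unknown"))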
In this book, you will learn how to design edge computing systems with containers and edge devices that use sensors, GPS modules, Wi-Fi, LoRa communication, and so on. Through the different use cases and examples covered in this book, you will also get to grips with solving common edge computing problems, such as updating your applications using GitOps and reading data from sensors and storing it in SQL and NoSQL databases. Later chapters will show you how to connect hardware to your edge clusters, make predictions using machine learning, and analyze images with computer vision. All the examples and use cases in this book are designed to run on devices with 64-bit ARM processors, using Raspberry Pi devices as the reference hardware.
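To illustrate the kind of sensor-to-database pipeline mentioned above, the following minimal sketch (again an illustration, not code from the book) stores a simulated temperature reading in Redis using the redis Python package; the host name redis.example.local, the device name, and the key layout are assumptions made for the example:

    # Minimal sketch: store a (simulated) sensor reading in Redis.
    # The Redis host/port and key names are assumptions for illustration;
    # a real edge device would read the value from a sensor such as a DHT11.
    import json
    import random
    import time

    import redis

    r = redis.Redis(host="redis.example.local", port=6379, decode_responses=True)

    reading = {
        "device": "raspberrypi-01",
        "temperature_c": round(random.uniform(18.0, 25.0), 1),  # stand-in for a DHT11 read
        "timestamp": int(time.time()),
    }

    # Keep the latest value and append to a per-device history list.
    r.set("sensors:raspberrypi-01:latest", json.dumps(reading))
    r.rpush("sensors:raspberrypi-01:history", json.dumps(reading))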
By the end of this book, you will be able to build your own edge computing systems, using the chapters as Lego pieces that you can combine to fit your needs.
This book is for engineers (developers and/or operators) seeking to bring the cloud native benefits of GitOps and Kubernetes to the edge. Anyone with basic knowledge of Linux and containers looking to learn Kubernetes using examples applied to edge computing and hardware systems will benefit from this book.
Cover
Title Page
Copyright and Credits
Dedication
Contributors
Table of Contents
Preface

Part 1: Edge Computing Basics

Chapter 1: Edge Computing with Kubernetes
  Technical requirements
  Edge data centers using K3s and basic edge computing concepts
  The edge and edge computing
  Benefits of edge computing
  Containers, Docker, and containerd for edge computing
  Distributed systems, edge computing, and Kubernetes
  Edge clusters using K3s – a lightweight Kubernetes
  Edge devices using ARM processors and micro data centers
  Edge computing diagrams to build your system
  Edge cluster and public cloud
  Regional edge clusters and public cloud
  Single node cluster and public/private cloud
  Adapting your software to run at the edge
  Adapting Go to run on ARM
  Adapting Rust to run on ARM
  Adapting Python to run on ARM
  Adapting Java to run on ARM
  Summary
  Questions
  Further reading

Chapter 2: K3s Installation and Configuration
  Technical requirements
  Introducing K3s and its architecture
  Preparing your edge environment to run K3s
  Hardware that you can use
  Linux distributions for ARM devices
  Creating K3s single and multi-node clusters
  Creating a single node K3s cluster using Ubuntu OS
  Adding more nodes to your K3s cluster for multi-node configuration
  Extracting K3s kubeconfig to access your cluster
  Advanced configurations
  Using external MySQL storage for K3s
  Installing Helm to install software packages in Kubernetes
  Changing the default ingress controller
  Uninstalling K3s from the master node or an agent node
  Troubleshooting a K3s cluster
  Summary
  Questions
  Further reading

Chapter 3: K3s Advanced Configurations and Management
  Technical requirements
  Bare metal load balancer with MetalLB
  Load balancer services in Kubernetes
  KlipperLB and MetalLB as bare metal load balancers
  KlipperLB and MetalLB – the goods and the bads
  Installing MetalLB
  Troubleshooting MetalLB
  Setting up Longhorn for storage
  Why use Longhorn?
  Installing Longhorn with ReadWriteMany mode
  Using Longhorn UI
  Upgrading your cluster
  Upgrading using K3s Bash scripts
  Upgrading K3s manually
  Restarting K3s
  Backing up and restoring your K3s configurations
  Backups from SQLite
  Backups and restoring from the SQL database K3s backend
  Embedded etcd management
  Installing the etcd backend
  Creating and restoring etcd snapshots
  Summary
  Questions
  Further reading

Chapter 4: k3OS Installation and Configurations
  Technical requirements
  k3OS – the Kubernetes operating system
  k3OS installation for x86_64 devices using an ISO image
  Advanced installations of k3OS using config files
  k3OS config file sections
  Configurations for master and agent nodes
  Multi-node cluster creation using config files
  Creating a multi-node K3s cluster using config files
  Multi-node ARM overlay installation
  Master node overlay installation
  Summary
  Questions
  Further reading

Chapter 5: K3s Homelab for Edge Computing Experiments
  Technical requirements
  Installing a multi-node K3s cluster on your local network
  Installing an Ubuntu image on your Raspberry device
  Configuring your Raspberry Pi to run the K3s installer
  Configuring the K3s master node
  Configuring the K3s agent nodes
  Installing MetalLB as the load balancing service
  Installing Longhorn with ReadWriteMany mode
  Extracting the K3s kubeconfig file to access your cluster
  Deploying your first application with kubectl
  Basic Kubernetes objects
  Deploying a simple NGINX server with pods using kubectl
  Deploying a Redis NoSQL database with pods
  Deploying and scaling an NGINX server with deployments
  Deploying a simple NGINX server using YAML files
  Deploying an NGINX server using a Pod
  Deploying an NGINX server using deployment
  Exposing your pods using the ClusterIP service and YAML files
  Exposing your pods using the NodePort service and YAML files
  Exposing your pods using a LoadBalancer service and YAML files
  Adding persistence to your applications
  Creating an NGINX pod with a storage volume
  Creating the database using a persistent volume
  Deploying a Kubernetes dashboard
  Summary
  Questions
  Further reading

Part 2: Cloud-Native Applications at the Edge

Chapter 6: Exposing Your Applications Using Ingress Controllers and Certificates
  Technical requirements
  Understanding ingress controllers
  Installing Helm for ingress controller installations
  Installing cert-manager
  NGINX ingress installation
  Using NGINX to expose your applications
  Using Traefik to expose your applications
  Contour ingress controller installation and use
  Using Contour with HTTPProxy and cert-manager
  Troubleshooting your ingress controllers
  Pros and cons of Traefik, NGINX, and Contour
  Tips and best practices for ingress controllers
  Summary
  Questions
  Further reading

Chapter 7: GitOps with Flux for Edge Applications
  Technical requirements
  Implementing GitOps for edge computing
  GitOps principles
  GitOps benefits
  GitOps, cloud native, and edge computing
  Flux and its architecture
  Designing GitOps with Flux for edge applications
  Creating a simple monorepo for GitOps
  Understanding the application and GitHub Actions
  Building your container image with GitHub Actions
  Installing and configuring Flux for GitOps
  Troubleshooting Flux installations
  Installing Flux monitoring dashboards
  Uninstalling Flux
  Summary
  Questions
  Further reading

Chapter 8: Observability and Traffic Splitting Using Linkerd
  Technical requirements
  Observability, monitoring, and analytics
  Golden metrics
  Introduction to service meshes and Linkerd
  Linkerd service mesh
  Implementing observability and traffic splitting with Linkerd
  Installing Linkerd in your cluster
  Installing and injecting the NGINX ingress controller
  Creating a demo application and faulty pods
  Testing observability and traffic splitting with Linkerd
  Using Linkerd’s CLI
  Uninstalling Linkerd
  Ideas to implement when using service meshes
  Summary
  Questions
  Further reading

Chapter 9: Edge Serverless and Event-Driven Architectures with Knative and Cloud Events
  Technical requirements
  Serverless at the edge with Knative and Cloud Events
  Implementing serverless functions using Knative Serving
  Installing Knative Serving
  Creating a simple serverless function
  Implementing a serverless API using traffic splitting with Knative
  Using declarative files in Knative
  Implementing events and event-driven pipelines using sequences with Knative Eventing
  Installing Knative Eventing
  Implementing a simple event
  Using sequences to implement event-driven pipelines
  Summary
  Questions
  Further reading

Chapter 10: SQL and NoSQL Databases at the Edge
  Technical requirements
  CAP theorem for SQL and NoSQL databases
  Creating a volume to persist your data
  Using MySQL and MariaDB SQL databases
  Using a Redis key-value NoSQL database
  Using a MongoDB document-oriented NoSQL database
  Using a PostgreSQL object-relational and SQL database
  Using a Neo4j graph NoSQL database
  Summary
  Questions
  Further reading

Part 3: Edge Computing Use Cases in Practice

Chapter 11: Monitoring the Edge with Prometheus and Grafana
  Technical requirements
  Monitoring edge environments
  Deploying Redis to persist Mosquitto sensor data
  Installing Mosquitto to process sensor data
  Processing Mosquitto topics
  Installing Prometheus, a time series database
  Deploying a custom exporter for Prometheus
  Configuring a DHT11 sensor to send humidity and temperature weather data
  Installing Grafana to create dashboards
  Summary
  Questions
  Further reading

Chapter 12: Communicating with Edge Devices across Long Distances Using LoRa
  Technical requirements
  LoRa wireless protocol and edge computing
  Deploying MySQL to store sensor data
  Deploying a service to store sensor data in a MySQL database
  Programming the ESP32 microcontroller to send sensor data
  Configuring Heltec ESP32 + LoRa to read DHT11 sensor data
  Installing the USB to UART bridge driver
  Installing Arduino IDE
  Troubleshooting Arduino IDE when using Heltec ESP32 + LoRa
  Uploading code to the ESP32 microcontroller to send sensor data
  Programming the ESP32 microcontroller to receive sensor data
  Visualizing data from ESP32 microcontrollers using MySQL and Grafana
  Summary
  Questions
  Further reading

Chapter 13: Geolocalization Applications Using GPS, NoSQL, and K3s Clusters
  Technical requirements
  Understanding how GPS is used in a geo-tracking system
  Using Redis to store GPS coordinates data
  Using MongoDB to store your device’s tracking data
  Creating services to monitor your devices in real time using GPS
  Deploying gps-server to store GPS coordinates
  Creating a service to log GPS positions and enable real-time tracking for your devices
  Deploying tracking-server to store logs from GPS coordinates to be used for vehicles routing report
  Configuring your Raspberry Pi to track your device using GPS
  Understanding the GPS reader code to send GPS coordinates
  Deploying gps-reader to send GPS coordinates to the cloud
  Visualizing your devices using Open Street Maps in real time
  Understanding the geo-tracking map visualizer code
  Understanding the vehicles routes report
  Deploying a real-time map and report application to track your devices
  Summary
  Questions
  Further reading

Chapter 14: Computer Vision with Python and K3s Clusters
  Technical requirements
  Computer vision and smart traffic systems
  Using Redis to store temporary object GPS positions
  Deploying a computer vision service to detect car obstacles using OpenCV, TensorFlow Lite, and scikit-learn
  Preparing your Raspberry Pi to run the computer vision application
  Deploying the inference service to detect objects
  Deploying the gps-queue service to store GPS coordinates
  Deploying traffic-manager to store GPS coordinates
  Deploying a simple proxy to bypass CORS
  Deploying the edge application to visualize warnings based on computer vision
  Installing the Traffic Map application to visualize objects detected by drivers
  Detecting objects with computer vision using OpenCV, TensorFlow Lite, and scikit-learn
  Deploying a global visualizer for the smart traffic system
  Summary
  Questions
  Further reading

Chapter 15: Designing Your Own Edge Computing System
  Using the edge computing system design canvas
  Purpose
  Features
  Challenges
  People
  Costs
  Automation
  Data
  Security
  Edge Devices
  Sensors
  Cloud
  Communication
  Metrics
  Using managed services from cloud providers
  Existing hardware for your projects
  Exploring complementary software for your system
  Recommendations to build your edge computing system
  Exploring additional edge computing use cases
  Summary
  Questions
  Further reading

About Packt
Other Books You May Enjoy
Index