The Next Leap in Robotics: A Technical Deep Dive into Google DeepMind’s Web-Enabled Agents
5 min read

The field of artificial intelligence is witnessing a remarkable convergence of disciplines. Recent breakthroughs, particularly those from Google DeepMind, are blurring the lines between large language models (LLMs), information retrieval, and physical robotics. We are moving beyond robots that execute pre-programmed, rigid instructions. The new frontier is embodied AI: agents that can understand high-level human commands, autonomously search for necessary information online, formulate a plan, and then execute that plan in the physical world. This “plan, fetch, act” paradigm represents a monumental shift, promising to unlock unprecedented capabilities for robots in both domestic and industrial settings. This article provides a comprehensive technical exploration of the software architecture behind such systems, using the Go programming language to illustrate the core concepts of concurrency and modularity that are essential for building these intelligent agents.

The Architectural Blueprint: Deconstructing the Plan-Fetch-Act Cycle

At the heart of these web-enabled robotic systems is a simple yet powerful cognitive loop: Plan, Fetch, and Act. This cycle mimics human problem-solving. When faced with an unfamiliar task, we first create a mental plan, then gather missing information (often by searching the web), and finally, we act on that information. To build a robust software system that mirrors this, we must first define its core components and their interactions. A well-structured architecture not only ensures reliability but also allows for individual components to be upgraded independently—for instance, swapping out the underlying language model without rewriting the entire robotic control logic.

Defining the Core Components with Go Interfaces

In software engineering, particularly in a language like Go, interfaces are the cornerstone of modular and testable design. They define a contract of behavior without specifying the implementation. For our robotic agent, we can define a central Task interface that all components in our cycle must adhere to. This allows the main control loop to handle any type of task—be it planning, fetching, or acting—in a uniform way.

Let’s model this in Go:

    Planner: This component takes a high-level goal (e.g., “Find a healthy beverage to soothe a sore throat”) and breaks it down into a sequence of smaller, actionable steps. This is typically powered by a sophisticated LLM, and the latest models from OpenAI and Anthropic are increasingly adept at this kind of reasoning and task decomposition. The output might be a list of tasks, such as `[FetchTask: “search for remedies for a sore throat”, ActTask: “go to the kitchen”, ActTask: “prepare a cup of tea with honey”]`.

    Fetcher: This component is responsible for executing information-gathering tasks. It interfaces with external systems, most commonly web search APIs, but could also query a vector database like Pinecone or Weaviate for internal knowledge. Frameworks like LangChain and LlamaIndex are often used to orchestrate these data retrieval and synthesis steps.

    Actor: This is the robotics-specific component that translates a digital command into physical motion. It controls the robot’s motors, grippers, and navigation systems to interact with the real world.

Here is a practical Go code snippet demonstrating how we can define these roles using an interface and concrete structs. This approach makes the system flexible and extensible.

package main

import (
	"fmt"
	"time"
)

// Task defines the common interface for any action our robot can take.
// This allows us to create a pipeline of heterogeneous tasks.
type Task interface {
	Execute() (string, error)
}

// PlanningTask represents the initial phase where an LLM would break down a goal.
type PlanningTask struct {
	Goal string
}

func (t PlanningTask) Execute() (string, error) {
	fmt.Printf("[PLANNER] Decomposing goal: '%s'\n", t.Goal)
	// In a real system, this would call a model from Google, OpenAI, or Cohere.
	time.Sleep(500 * time.Millisecond) // Simulate API call latency
	// The output is a plan, which would inform the next tasks.
	plan := "Plan: 1. Search for 'best tea for sore throat'. 2. Go to kitchen. 3. Prepare tea."
	fmt.Printf("[PLANNER] Generated Plan: %s\n", plan)
	return plan, nil
}

// FetchingTask represents an information gathering step, like a web search.
type FetchingTask struct {
	Query string
}

func (t FetchingTask) Execute() (string, error) {
	fmt.Printf("[FETCHER] Executing web search for: '%s'\n", t.Query)
	// This would use a search API or a tool like Haystack.
	time.Sleep(1 * time.Second) // Simulate network latency
	result := "Honey and lemon tea is highly recommended."
	fmt.Printf("[FETCHER] Found result: %s\n", result)
	return result, nil
}

// ActingTask represents a physical action performed by the robot.
type ActingTask struct {
	Action      string
	ContextInfo string // Information from a previous FetchingTask
}

func (t ActingTask) Execute() (string, error) {
	fmt.Printf("[ACTOR] Performing action: '%s' with context: '%s'\n", t.Action, t.ContextInfo)
	// This would interface with the robot's motor control system.
	time.Sleep(2 * time.Second) // Simulate physical movement
	result := "Action completed successfully."
	fmt.Printf("[ACTOR] %s\n", result)
	return result, nil
}

func main() {
	fmt.Println("--- Robotic Agent Task Execution ---")

	// Create a sequence of tasks based on a hypothetical plan.
	tasks := []Task{
		PlanningTask{Goal: "Soothe my sore throat."},
		FetchingTask{Query: "best tea for sore throat"},
		ActingTask{Action: "Prepare tea", ContextInfo: "Honey and lemon tea is highly recommended."},
	}

	// Execute tasks sequentially.
	for _, task := range tasks {
		_, err := task.Execute()
		if err != nil {
			fmt.Printf("Error executing task: %v\n", err)
			break // Stop on error
		}
	}
}

Building the Engine: Concurrent Information Gathering and Action

A robot operating in the real world cannot be a blocking, single-threaded application. While it waits for a web search to return results (a high-latency I/O operation), it must continue to process sensor data, maintain balance, and respond to its environment. This is where concurrency becomes non-negotiable. Go’s first-class support for concurrency through goroutines (lightweight threads) and channels (typed conduits for communication) makes it an exceptionally well-suited language for this domain. The same approach is used in high-performance inference servers like NVIDIA’s Triton Inference Server, which must handle many requests simultaneously.

A Practical Go Example: The Fetcher Goroutine

Let’s refactor our previous example to make the fetching process non-blocking. The main control loop will delegate the `FetchingTask` to a separate goroutine, then use a channel to receive the result whenever it’s ready, without halting all other operations. This pattern is fundamental to building responsive and efficient robotic systems. While the fetcher is working, the robot’s main loop could be processing data from its cameras or other sensors, a task that might itself leverage models optimized with tools like TensorRT or OpenVINO for on-device performance.

package main

import (
	"fmt"
	"time"
)

// fetchWebInfo simulates a network call to a search engine or API.
// It takes a query and a channel to send the result back.
func fetchWebInfo(query string, resultsChan chan<- string) {
	fmt.Printf("[FETCHER GOROUTINE] Starting web search for: '%s'\n", query)
	// Simulate network latency. In a real application, this would be an http.Get call.
	time.Sleep(2 * time.Second)
	result := fmt.Sprintf("Web Result for '%s': Chamomile tea is a good option.", query)
	fmt.Println("[FETCHER GOROUTINE] Search complete. Sending result to channel.")
	resultsChan <- result // Send the result back to the main goroutine.
}

func main() {
	fmt.Println("--- Concurrent Robotic Agent ---")

	// A channel to receive the results from our concurrent fetcher.
	// A buffered channel of size 1 means it can hold one result without the sender blocking.
	infoChannel := make(chan string, 1)

	// The robot receives a command to gather information.
	// It launches the fetch operation in a new goroutine so it's not blocked.
	go fetchWebInfo("remedies for headache", infoChannel)

	// The main loop can now do other things while waiting for the info.
	// For example, monitoring battery levels or navigating.
	fmt.Println("[MAIN LOOP] Fetcher launched. Performing other tasks...")
	for i := 0; i < 3; i++ {
		fmt.Println("[MAIN LOOP] ...monitoring environment...")
		time.Sleep(500 * time.Millisecond)
	}

	// Now, wait for the result from the channel.
	// This line will block until a message is available on infoChannel.
	fmt.Println("[MAIN LOOP] Awaiting information from fetcher...")
	fetchedInfo := <-infoChannel

	fmt.Printf("[MAIN LOOP] Received info: '%s'\n", fetchedInfo)
	fmt.Println("[MAIN LOOP] Now planning action based on new info.")
	// The robot can now use this information to perform a physical action.
}

Orchestrating Complex Workflows with Channels and Select

The true power of this architecture is revealed when we chain multiple asynchronous steps together. A complete “plan, fetch, act” cycle is a pipeline: the planner’s output becomes the fetcher’s input, and the fetcher’s output informs the actor. Go channels are perfect for building these pipelines, allowing different stages (each running in its own goroutine) to pass data safely and efficiently. This mirrors the data and model pipelines managed by MLOps platforms like MLflow or cloud services such as Vertex AI and AWS SageMaker, but applied to real-time robotics.

Code Example: A Full Plan-Fetch-Act Pipeline

In this advanced example, we’ll create a multi-stage pipeline using three separate goroutines for planning, fetching, and acting. They communicate exclusively through channels. The `main` function orchestrates the setup and injects the initial goal. We also introduce a `select` statement, a Go construct that lets a goroutine wait on multiple channel operations, making it easy to handle timeouts or other events. This is critical for a robot that can’t afford to wait indefinitely for a single step to complete. The architecture also scales well, in the spirit of distributed computing frameworks like Ray or Dask, which are designed for large-scale parallel processing.

package main

import (
	"fmt"
	"time"
)

// Define structs to carry data between stages
type Plan struct{ SubTasks []string }
type FetchedInfo struct{ Data string }
type ActionResult struct{ Status string }

// plannerGoroutine simulates an LLM creating a plan.
func plannerGoroutine(goal string, planChan chan<- Plan) {
	fmt.Println("[PLANNER] Received goal, thinking...")
	time.Sleep(1 * time.Second)
	// In a real system, this plan would be more dynamic.
	newPlan := Plan{SubTasks: []string{"search:how to water a fern", "action:get watering can"}}
	fmt.Println("[PLANNER] Plan created. Sending to fetcher.")
	planChan <- newPlan
}

// fetcherGoroutine simulates searching the web for information.
func fetcherGoroutine(planChan <-chan Plan, infoChan chan<- FetchedInfo) {
	plan := <-planChan // Wait for a plan
	fmt.Printf("[FETCHER] Received plan. Looking for search tasks. Plan: %v\n", plan.SubTasks)
	
	// Find the search task in the plan. Tasks are encoded as "search:<query>"
	// or "action:<name>", so match on the full "search:" prefix.
	const prefix = "search:"
	for _, task := range plan.SubTasks {
		if len(task) > len(prefix) && task[:len(prefix)] == prefix {
			query := task[len(prefix):]
			fmt.Printf("[FETCHER] Found search task. Querying: '%s'\n", query)
			time.Sleep(1 * time.Second) // Simulate network call
			info := FetchedInfo{Data: "Ferns prefer moist, but not soggy, soil."}
			fmt.Println("[FETCHER] Search complete. Sending info to actor.")
			infoChan <- info
			return
		}
	}
}

// actorGoroutine simulates the robot performing a physical action.
func actorGoroutine(infoChan <-chan FetchedInfo, resultChan chan<- ActionResult) {
	info := <-infoChan // Wait for information
	fmt.Printf("[ACTOR] Received info: '%s'. Preparing to act.\n", info.Data)
	time.Sleep(2 * time.Second) // Simulate physical action
	fmt.Println("[ACTOR] Action complete.")
	resultChan <- ActionResult{Status: "Watering complete."}
}

func main() {
	goal := "Water the plant."

	// Create the channels that connect the pipeline stages.
	planChannel := make(chan Plan)
	infoChannel := make(chan FetchedInfo)
	resultChannel := make(chan ActionResult)

	// Start the pipeline goroutines.
	go plannerGoroutine(goal, planChannel)
	go fetcherGoroutine(planChannel, infoChannel)
	go actorGoroutine(infoChannel, resultChannel)

	fmt.Printf("--- Full Pipeline Initiated for Goal: '%s' ---\n", goal)

	// Use a select statement to wait for the final result or a timeout.
	select {
	case result := <-resultChannel:
		fmt.Printf("--- PIPELINE SUCCESS: %s ---\n", result.Status)
	case <-time.After(10 * time.Second):
		fmt.Println("--- PIPELINE TIMEOUT: The task took too long to complete. ---")
	}
}

Best Practices and Integration with the AI Ecosystem

Building a production-ready robotic agent involves more than just a concurrent pipeline. It requires careful consideration of error handling, model optimization, and integration with the broader machine learning ecosystem.

Error Handling and Resilience

What happens if a web search fails due to a network error? Or if the robot’s gripper fails to pick up an object? A robust system must anticipate these failures. In Go, this can be handled by passing error values through channels alongside results, or by using the `context` package to signal cancellation across multiple goroutines. This ensures that a failure in one part of the pipeline doesn’t cause the entire system to hang or crash.

Model Integration and Optimization

The planning models behind these agents can be served from hosted APIs, such as those from Google, OpenAI, or Cohere mentioned above, or run directly on the robot’s hardware. For on-device perception and control, optimizing models with toolkits like TensorRT or OpenVINO is essential to keep inference latency compatible with real-time control loops.

The Data Flywheel

Perhaps the most exciting aspect of these systems is the potential for a “data flywheel.” Every task the robot attempts, whether successful or not, generates valuable data. This data (video feeds, sensor readings, action sequences, outcomes) can be used to fine-tune the underlying models. A failed attempt to grasp an object becomes a new training example for the grasp-planning model. This continuous loop of action, data collection, and retraining is the key to creating robots that learn and improve over time. In practice, it draws on model repositories like the Hugging Face Hub, training frameworks like DeepSpeed, and platforms like Kaggle for collaborative model development.

Conclusion: The Dawn of General-Purpose Robotics

The integration of web-scale knowledge with physical robotic action, as pioneered by research from labs like Google DeepMind, is not an incremental improvement; it is a paradigm shift. The “plan, fetch, act” model provides a clear architectural path toward more general-purpose, autonomous robots that can solve a wide array of novel problems. We’ve seen how Go’s concurrency primitives—goroutines, channels, and the select statement—provide an elegant and powerful toolkit for building the complex, non-blocking software required to power these agents.

For developers and engineers, the path forward is clear. The convergence of LLMs, robotics, and concurrent programming opens up a new world of possibilities, and the next step is to begin experimenting with these concepts. By combining powerful APIs from providers like Mistral AI or Cohere with robotics simulation environments and the robust concurrency of Go, we can start building the next generation of intelligent machines that can truly understand, reason about, and act upon our world.