Python API for communicating with the CARLA server.
Project description
This is the Python package for the CARLA Python API, used for controlling and communicating with CARLA, the open-source simulator for autonomous driving research.
This package allows you to control the CARLA simulator and retrieve simulation data through the Python API. For instance, you can control any actor (vehicle, pedestrian, traffic light, etc.) in the simulation, attach sensors to vehicles, read the sensor data, and so on.
For more information, refer to the Guide for getting started with CARLA.
Usage:
- Install:
pip install carla
- Use:
#!/usr/bin/env python
import carla
...
Python API tutorial
In this tutorial we introduce the basic concepts of the CARLA Python API, as well as an overview of its most important functionalities. The reference of all classes and methods available can be found at Python API reference.
!!! note
This document applies only to the latest development version.
The API has been significantly changed in the latest versions starting at
0.9.0. We commonly refer to the new API as 0.9.X API as opposed to
the previous 0.8.X API.
First of all, we need to introduce a few core concepts:
- Actor: An actor is anything that plays a role in the simulation and can be moved around; examples of actors are vehicles, pedestrians, and sensors.
- Blueprint: Before spawning an actor you need to specify its attributes, and that's what blueprints are for. We provide a blueprint library with the definitions of all the actors available.
- World: The world represents the currently loaded map and contains the functions for converting a blueprint into a living actor, among others. It also provides access to the road map and functions to change the weather conditions.
Connecting and retrieving the world
To connect to a simulator we need to create a "Client" object; to do so we need to provide the IP address and port of a running instance of the simulator
client = carla.Client('localhost', 2000)
The first recommended thing to do right after creating a client instance is setting its time-out. This time-out sets a time limit on all networking operations; if the time-out is not set, networking operations may block forever
client.set_timeout(10.0) # seconds
Once we have the client configured we can directly retrieve the world
world = client.get_world()
Typically we won't need the client object anymore; all the objects created by the world will connect to the IP and port provided if they need to. These operations are usually done in the background and are transparent to the user.
Blueprints
A blueprint contains the information necessary to create a new actor. For instance, if the blueprint defines a car, we can change its color here; if it defines a lidar, we can decide here how many channels the lidar will have. A blueprint also has an ID that uniquely identifies it and all the actor instances created with it. Examples of IDs are "vehicle.nissan.patrol" or "sensor.camera.depth".
The list of all available blueprints is kept in the blueprint library
blueprint_library = world.get_blueprint_library()
The library allows us to find specific blueprints by ID, filter them with wildcards, or just choose one at random
import random

# Find a specific blueprint.
collision_sensor_bp = blueprint_library.find('sensor.other.collision')
# Choose a vehicle blueprint at random.
vehicle_bp = random.choice(blueprint_library.filter('vehicle.bmw.*'))
Some of the attributes of the blueprints can be modified while others are read-only. For instance, we cannot modify the number of wheels of a vehicle, but we can change its color
vehicles = blueprint_library.filter('vehicle.*')
bikes = [x for x in vehicles if int(x.get_attribute('number_of_wheels')) == 2]
for bike in bikes:
    bike.set_attribute('color', '255,0,0')
Modifiable attributes also come with a list of recommended values
for attr in blueprint:
    if attr.is_modifiable:
        blueprint.set_attribute(attr.id, random.choice(attr.recommended_values))
The blueprint system has been designed to make it easy for contributors to add their custom actors directly in Unreal Editor; we'll add a tutorial on this soon, stay tuned!
Spawning actors
Once we have the blueprint set up, spawning an actor is pretty straightforward
transform = carla.Transform(carla.Location(x=230, y=195, z=40), carla.Rotation(yaw=180))
actor = world.spawn_actor(blueprint, transform)
The spawn actor function comes in two flavours: spawn_actor and try_spawn_actor. The former will raise an exception if the actor could not be spawned, the latter will return None instead. The most typical cause of failure is a collision at the spawn point, meaning the actor does not fit at the spot we chose; probably another vehicle is in that spot or we tried to spawn into a static object.
To ease the task of finding a spawn location, each map provides a list of recommended transforms
spawn_points = world.get_map().get_spawn_points()
We'll add more on the map object later in this tutorial.
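As a minimal sketch of both ideas together (assuming vehicle_bp is a vehicle blueprint like the one chosen above), we can pick a random recommended spawn point and handle the case where the spot is occupied
import random

# Pick one of the recommended spawn points for the current map.
spawn_points = world.get_map().get_spawn_points()
transform = random.choice(spawn_points)

# try_spawn_actor returns None instead of raising if the spot is occupied.
vehicle = world.try_spawn_actor(vehicle_bp, transform)
if vehicle is None:
    print('Spawn point was occupied, try another one')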
Finally, the spawn functions have an optional argument that controls whether the actor is going to be attached to another actor. This is especially useful for sensors. In the next example, the camera remains rigidly attached to our vehicle during the rest of the simulation
camera = world.spawn_actor(camera_bp, relative_transform, attach_to=my_vehicle)
Note that in this case, the transform provided is treated relative to the parent actor.
Handling actors
Once we have an actor alive in the world, we can move this actor around and check its dynamic properties
location = actor.get_location()
location.z += 10.0
actor.set_location(location)
print(actor.get_acceleration())
print(actor.get_velocity())
We can even freeze an actor by disabling its physics simulation
actor.set_simulate_physics(False)
And once we get tired of an actor we can remove it from the simulation with
actor.destroy()
Note that actors are not cleaned up automatically when the Python script finishes; if we want to get rid of them we need to explicitly destroy them.
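A minimal sketch of this cleanup, assuming we kept every actor we spawned in a Python list
# Keep track of everything we spawn so we can clean up at the end.
actor_list = [my_vehicle, camera]

# At the end of the script, explicitly destroy every actor we spawned.
for actor in actor_list:
    actor.destroy()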
!!! important Known issue: To improve performance, most of the methods send requests to the simulator asynchronously. The simulator queues each of these requests, but only has a limited amount of time each update to parse them. If we flood the simulator by calling "set" methods too often, e.g. set_transform, the requests will accumulate significant lag.
Vehicles
Vehicles are a special type of actor that provide a few extra methods. Apart from the handling methods common to all actors, vehicles can also be controlled by providing throttle, brake, and steer values
vehicle.apply_control(carla.VehicleControl(throttle=1.0, steer=-1.0))
These are all the parameters of the VehicleControl object and their default values
carla.VehicleControl(
    throttle = 0.0,
    steer = 0.0,
    brake = 0.0,
    hand_brake = False,
    reverse = False,
    manual_gear_shift = False,
    gear = 0)
Also, physics control properties can be tuned for vehicles and their wheels
vehicle.apply_physics_control(carla.VehiclePhysicsControl(max_rpm = 5000.0, center_of_mass = carla.Vector3D(0.0, 0.0, 0.0), torque_curve=[[0,400],[5000,400]]))
These properties are controlled through a VehiclePhysicsControl object, which also contains a property to control each wheel's physics through a WheelPhysicsControl object.
carla.VehiclePhysicsControl(
    torque_curve,
    max_rpm,
    moi,
    damping_rate_full_throttle,
    damping_rate_zero_throttle_clutch_engaged,
    damping_rate_zero_throttle_clutch_disengaged,
    use_gear_autobox,
    gear_switch_time,
    clutch_strength,
    mass,
    drag_coefficient,
    center_of_mass,
    steering_curve,
    wheels)
Where:
- torque_curve: Curve that indicates the torque (in Nm) for a given number of revolutions per minute of the vehicle's engine.
- max_rpm: The maximum revolutions per minute of the vehicle's engine.
- moi: The moment of inertia of the vehicle's engine.
- damping_rate_full_throttle: Damping rate when the throttle is at maximum.
- damping_rate_zero_throttle_clutch_engaged: Damping rate when the throttle is zero with the clutch engaged.
- damping_rate_zero_throttle_clutch_disengaged: Damping rate when the throttle is zero with the clutch disengaged.
- use_gear_autobox: If true, the vehicle will have an automatic transmission.
- gear_switch_time: Switching time between gears.
- clutch_strength: The clutch strength of the vehicle, measured in kg*m^2/s.
- mass: The mass of the vehicle, measured in kg.
- drag_coefficient: Drag coefficient of the vehicle's chassis.
- center_of_mass: The center of mass of the vehicle.
- steering_curve: Curve that indicates the maximum steering angle for a given forward speed.
- wheels: List of WheelPhysicsControl objects.
carla.WheelPhysicsControl(
    tire_friction,
    damping_rate,
    steer_angle,
    disable_steering)
Where:
- tire_friction: Scalar value that indicates the friction of the wheel.
- damping_rate: The damping rate of the wheel.
- steer_angle: The maximum angle in degrees that the wheel can steer.
- disable_steering: If true, the wheel will not steer.
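As an illustrative sketch of how these fit together (the numbers below are arbitrary, and the usual front, front, back, back wheel order is assumed)
front_wheel = carla.WheelPhysicsControl(tire_friction=4.5, damping_rate=1.0, steer_angle=70.0, disable_steering=False)
back_wheel = carla.WheelPhysicsControl(tire_friction=4.5, damping_rate=1.0, steer_angle=0.0, disable_steering=False)

# Build a physics control with a custom engine setup and per-wheel settings,
# then apply it to an existing vehicle actor.
physics_control = carla.VehiclePhysicsControl(
    max_rpm=5000.0,
    torque_curve=[[0, 400], [5000, 400]],
    wheels=[front_wheel, front_wheel, back_wheel, back_wheel])
vehicle.apply_physics_control(physics_control)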
Our vehicles also come with a handy autopilot
vehicle.set_autopilot(True)
Since this has been a common misconception, we need to clarify that this autopilot control is purely hard-coded into the simulator and is not based at all on machine learning techniques.
Finally, vehicles also have a bounding box that encapsulates them
box = vehicle.bounding_box
print(box.location) # Location relative to the vehicle.
print(box.extent) # XYZ half-box extents in meters.
Sensors
Sensors are actors that produce a stream of data. Sensors are such a key component of CARLA that they deserve their own documentation page, so here we'll limit ourselves to showing a small example of how sensors work
camera_bp = blueprint_library.find('sensor.camera.rgb')
camera = world.spawn_actor(camera_bp, relative_transform, attach_to=my_vehicle)
camera.listen(lambda image: image.save_to_disk('output/%06d.png' % image.frame_number))
In this example we have attached a camera to a vehicle, and told the camera to save each of the generated images to disk.
The full list of sensors and their measurements is explained in Cameras and sensors.
Other actors
Apart from vehicles and sensors, there are a few other actors in the world. The full list can be requested from the world with
actor_list = world.get_actors()
The actor list object returned has functions for finding, filtering, and iterating actors
# Find an actor by id.
actor = actor_list.find(id)
# Print the location of all the speed limit signs in the world.
for speed_sign in actor_list.filter('traffic.speed_limit.*'):
    print(speed_sign.get_location())
Among the actors you can find in this list are
- Traffic lights with a state property to check the light's current state.
- Speed limit signs with the speed codified in their type_id.
- The spectator actor that can be used to move the view of the simulator window.
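As a small sketch of how these can be used (names follow the reference listing below; world and vehicle are assumed to exist already)
# Force every red traffic light in the world to green.
for traffic_light in world.get_actors().filter('traffic.traffic_light'):
    if traffic_light.state == carla.TrafficLightState.Red:
        traffic_light.set_state(carla.TrafficLightState.Green)

# Move the spectator view to our vehicle's current transform.
spectator = world.get_spectator()
spectator.set_transform(vehicle.get_transform())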
Changing the weather
The lighting and weather conditions can be requested and changed with the world object
weather = carla.WeatherParameters(
    cloudyness=80.0,
    precipitation=30.0,
    sun_altitude_angle=70.0)
world.set_weather(weather)
print(world.get_weather())
For convenience, we also provide a list of predefined weather presets that can be directly applied to the world
world.set_weather(carla.WeatherParameters.WetCloudySunset)
The full list of presets can be found in the WeatherParameters reference.
Map and waypoints
One of the key features of CARLA is that our roads are fully annotated. All our maps come accompanied by OpenDrive files that define the road layout. Furthermore, we provide a higher-level API for querying and navigating this information.
These objects were a recent addition to our API and are still in heavy development; we hope to make them much more powerful soon.
Let's start by getting the map of the current world
map = world.get_map()
For starters, the map has a name attribute that matches the name of the currently loaded city, e.g. Town01. And, as we've seen before, we can also ask the map to provide a list of recommended locations for spawning vehicles, map.get_spawn_points().
However, the real power of this map API becomes apparent when we introduce waypoints. We can tell the map to give us a waypoint on the road closest to our vehicle
waypoint = map.get_waypoint(vehicle.get_location())
This waypoint's transform is located on a drivable lane, and it's oriented according to the road direction at that point.
Waypoints also have a function to query the "next" waypoints; this method returns a list of waypoints at a certain distance that can be reached from this waypoint following the traffic rules. In other words, if a vehicle is placed at this waypoint, it gives us the list of possible locations that this vehicle can drive to. Let's see a practical example
# Retrieve the closest waypoint.
waypoint = map.get_waypoint(vehicle.get_location())
# Disable physics, in this example we're just teleporting the vehicle.
vehicle.set_simulate_physics(False)
while True:
    # Find next waypoint 2 meters ahead.
    waypoint = random.choice(waypoint.next(2.0))
    # Teleport the vehicle.
    vehicle.set_transform(waypoint.transform)
The map object also provides methods for generating waypoints in bulk all over the map, at an approximate distance between them
waypoint_list = map.generate_waypoints(2.0)
For routing purposes, it is also possible to retrieve a topology graph of the roads
waypoint_tuple_list = map.get_topology()
This method returns a list of pairs (tuples) of waypoints; for each pair, the first element connects with the second one. Only the minimal set of waypoints needed to define the topology is generated by this method: one waypoint for each lane of each road segment in the map.
Finally, to allow access to the whole road information, the map object can be converted to OpenDrive format, and saved to disk as such.
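A small sketch of both options, assuming to_opendrive() returns the OpenDrive definition as a string and save_to_disk() defaults to the map name (as listed in the reference below)
# Get the OpenDrive definition of the current map as an XML string.
opendrive_xml = map.to_opendrive()

# Or write it straight to disk; the default path is the map name.
map.save_to_disk()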
Recording and Replaying system
CARLA now includes a recording and replaying API that allows recording a simulation to a file and replaying it later. The file is written on the server side only, and it includes which actors are created or destroyed in the simulation, the state of the traffic lights, and the position/orientation of all vehicles and walkers.
All data is written to a binary file on the server, in the folder CarlaUE4/Saved. As an estimate, a simulation with about 150 actors (50 traffic lights, 100 vehicles) recorded for 1 hour takes around 200 MB.
To start recording we only need to supply a file name:
client.start_recorder("recording01.log")
To stop the recording, we need to call:
client.stop_recorder()
At any point we can replay a simulation, specifying the filename:
client.replay_file("recording01.log")
The replayer will create and destroy all actors that were recorded, move all actors, and set the traffic lights as they were at each moment.
When replaying we have some other options that we can use; the full API call is:
client.replay_file("recording01.log", start, duration, camera)
- start: time at which we want to start the replay.
  - If the value is positive, it means the number of seconds from the beginning. E.g. a value of 10 will start the replay at second 10.
  - If the value is negative, it means the number of seconds from the end. E.g. a value of -10 will replay only the last 10 seconds of the simulation.
- duration: how many seconds we want to replay. If the replay stops before the end of the recording, all actors will have autopilot enabled automatically. The intention here is to allow replaying a piece of a simulation and then letting all actors start driving in autopilot again.
- camera: we can specify the id of an actor and the camera will follow that actor while replaying. Keep reading to learn how to find out an actor's id.
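For example, a sketch that replays only the last 30 seconds of the recording while following a particular actor (the id 2276 is taken from the example output further below; yours will differ)
# Start 30 seconds before the end, replay to the end, follow actor 2276.
client.replay_file("recording01.log", -30.0, 0.0, 2276)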
We can get details about a recorded simulation using this API:
client.show_recorder_file_info("recording01.log")
The output is something like this:
Version: 1
Map: Town05
Date: 02/21/19 10:46:20
Frame 1 at 0 seconds
Create 2190: spectator (0) at (-260, -200, 382.001)
Create 2191: traffic.traffic_light (3) at (4255, 10020, 0)
Create 2192: traffic.traffic_light (3) at (4025, 7860, 0)
Create 2193: traffic.traffic_light (3) at (1860, 7975, 0)
Create 2194: traffic.traffic_light (3) at (1915, 10170, 0)
...
Create 2258: traffic.speed_limit.90 (0) at (21651.7, -1347.59, 15)
Create 2259: traffic.speed_limit.90 (0) at (5357, 21457.1, 15)
Create 2260: traffic.speed_limit.90 (0) at (858, 18176.7, 15)
Frame 2 at 0.0254253 seconds
Create 2276: vehicle.mini.cooperst (1) at (4347.63, -8409.51, 120)
number_of_wheels = 4
object_type =
color = 255,241,0
role_name = autopilot
Frame 4 at 0.0758538 seconds
Create 2277: vehicle.diamondback.century (1) at (4017.26, 14489.8, 123.86)
number_of_wheels = 2
object_type =
color = 50,96,242
role_name = autopilot
Frame 6 at 0.122666 seconds
Create 2278: vehicle.seat.leon (1) at (3508.17, 7611.85, 120.002)
number_of_wheels = 4
object_type =
color = 237,237,237
role_name = autopilot
Frame 8 at 0.171718 seconds
Create 2279: vehicle.diamondback.century (1) at (3160, 3020.07, 120.002)
number_of_wheels = 2
object_type =
color = 50,96,242
role_name = autopilot
Frame 10 at 0.219568 seconds
Create 2280: vehicle.bmw.grandtourer (1) at (-5405.99, 3489.52, 125.545)
number_of_wheels = 4
object_type =
color = 0,0,0
role_name = autopilot
Frame 2350 at 60.2805 seconds
Destroy 2276
Frame 2351 at 60.3057 seconds
Destroy 2277
Frame 2352 at 60.3293 seconds
Destroy 2278
Frame 2353 at 60.3531 seconds
Destroy 2279
Frame 2354 at 60.3753 seconds
Destroy 2280
Frames: 2354
Duration: 60.3753 seconds
From here we know the date and the map where the simulation was recorded. Then, for each frame that has an event (creating or destroying an actor, collisions), that info is shown. For created actors we see the id they have and some info about the actor to create. This is the id we need to specify in the camera option when replaying if we want to follow that actor during the replay. At the end we can see the total time of the recording and also the number of frames that were recorded.
In simulations with a hero actor the collisions are automatically saved, so we can query a recorded file to see if any hero actor had collisions with some other actor. Currently the actor types we can use in the query are these:
- h = Hero
- v = Vehicle
- w = Walker
- t = Traffic light
- o = Other
- a = Any
The collision query needs to know the type of actors involved in the collision. If we don't care, we can specify a (any) for both. These are some examples:
- a a: will show all collisions recorded
- v v: will show all collisions between vehicles
- v t: will show all collisions between a vehicle and a traffic light
- v w: will show all collisions between a vehicle and a walker
- v o: will show all collisions between a vehicle and other actor, like static meshes
- h w: will show all collisions between a hero and a walker
Currently only hero actors record collisions, so the first actor will always be a hero.
The API for querying the collisions is:
client.show_recorder_collisions("recording01.log", "a", "a")
The output is something similar to this:
Version: 1
Map: Town05
Date: 02/19/19 15:36:08
Time Types Id Actor 1 Id Actor 2
16 v v 122 vehicle.yamaha.yzf 118 vehicle.dodge_charger.police
27 v o 122 vehicle.yamaha.yzf 0
Frames: 790
Duration: 46 seconds
There we can see, for each collision, the time when it happened, the types of the actors involved, and the id and description of each actor.
So, if we want to see what happened in that recording at the first collision, where the hero actor collided with a vehicle, we could use this API:
client.replay_file("col2.log", 13, 0, 122)
We have started the replayer just a bit before the time of the collision, so we can see how it happened. Also, a value of 0 for the duration means replaying the whole file (it is the default value).
There is another API to get information about actors that have been blocked by something and cannot continue on their way. This can be useful for finding incidents in the simulation. The API is:
client.show_recorder_actors_blocked("recording01.log", min_time, min_distance)
The parameters are:
- min_time: the minimum time an actor needs to be stopped to be considered blocked (in seconds).
- min_distance: the minimum distance an actor needs to move to not be considered stopped (in cm).
So, if we want to know which actors are stopped (moving less than 1 meter over 60 seconds), we could use something like:
client.show_recorder_actors_blocked("col3.log", 60, 100)
The result can be something like this (it is sorted by duration):
Version: 1
Map: Town05
Date: 02/19/19 15:45:01
Time Id Actor Duration
36 173 vehicle.nissan.patrol 336
75 104 vehicle.dodge_charger.police 295
75 214 vehicle.chevrolet.impala 295
234 76 vehicle.nissan.micra 134
241 162 vehicle.audi.a2 128
302 143 vehicle.bmw.grandtourer 67
303 133 vehicle.nissan.micra 67
303 167 vehicle.audi.a2 66
302 80 vehicle.nissan.micra 67
Frames: 6985
Duration: 374 seconds
These lines tell us when an actor was stopped for at least the minimum time specified. For example, in the 6th line, actor 143 was stopped at time 302 seconds for 67 seconds.
We could check what happened at that time with the following API command:
client.replay_file("col3.log", 302, 0, 143)
We can see there is a jam that actually blocks the actor (the red vehicle). We can also check another actor involved with:
client.replay_file("col3.log", 75, 0, 104)
We can see it is the same incident, but from the point of view of another actor involved (the police car).
The result is sorted by duration, so the actor blocked for the longest time comes first. Checking the first line, the actor with id 173 got stopped at time 36 seconds for 336 seconds. We could check how it got into that situation by replaying a few seconds before time 36.
client.replay_file("col3.log", 34, 0, 173)
We can then see the actor responsible for the incident.
Sample Python scripts to use with the recording / replaying system
There are some scripts you could use:
- start_recording.py: this will start recording, and optionally you can spawn several actors and define how long you want to record.
  - -f: name of the recording file to write
  - -n: number of vehicles to spawn (optional, 10 by default)
  - -t: duration of the recording (optional)
- start_replaying.py: this will start replaying a file. We can define the starting time, duration and also an actor to follow.
  - -f: name of the recording file
  - -s: starting time (optional, by default from the start)
  - -d: duration (optional, by default the whole file)
  - -c: actor to follow (id) (optional)
- show_recorder_collisions.py: this will show all the collisions that happened while recording (currently only those involving hero actors).
  - -f: name of the recording file
  - -t: two letters defining the types of the actors involved, for example: -t aa
    - h = Hero
    - v = Vehicle
    - w = Walker
    - t = Traffic light
    - o = Other
    - a = Any
- show_recorder_actors_blocked.py: this will show all the actors that got blocked (stopped) in the recording. We can define the time and distance thresholds to consider an actor blocked.
  - -f: name of the recording file
  - -t: minimum seconds stopped to be considered blocked (optional)
  - -d: minimum distance to be considered stopped (optional)
Python API Reference
!!! important Versions prior to 0.9.0 have a very different API. For the documentation of the stable version please switch to the stable branch.
carla.Client
Client(host, port, worker_threads=0)
set_timeout(float_seconds)
get_client_version()
get_server_version()
get_world()
get_available_maps()
reload_world()
load_world(map_name)
start_recorder(string filename)
replay_file(string filename, float start, float duration, int camera_follow_id)
show_recorder_file_info(string filename)
show_recorder_collisions(string filename, char category1, char category2)
show_recorder_actors_blocked(string filename, float min_time, float min_distance)
apply_batch(commands, do_tick=False)
carla.World
id
debug
get_blueprint_library()
get_map()
get_spectator()
get_settings()
apply_settings(world_settings)
get_weather()
set_weather(weather_parameters)
get_actors()
spawn_actor(blueprint, transform, attach_to=None)
try_spawn_actor(blueprint, transform, attach_to=None)
wait_for_tick(seconds=1.0)
on_tick(callback)
tick()
carla.WorldSettings
synchronous_mode
no_rendering_mode
__eq__(other)
__ne__(other)
carla.DebugHelper
draw_point(location, size=0.1, color=carla.Color(), life_time=-1.0, persistent_lines=True)
draw_line(begin, end, thickness=0.1, color=carla.Color(), life_time=-1.0, persistent_lines=True)
draw_arrow(begin, end, thickness=0.1, arrow_size=0.1, color=carla.Color(), life_time=-1.0, persistent_lines=True)
draw_box(box, rotation, thickness=0.1, color=carla.Color(), life_time=-1.0, persistent_lines=True)
draw_string(location, text, draw_shadow=False, color=carla.Color(), life_time=-1.0, persistent_lines=True)
carla.BlueprintLibrary
find(id)
filter(wildcard_pattern)
__getitem__(pos)
__len__()
__iter__()
carla.ActorBlueprint
id
tags
has_tag(tag)
match_tags(wildcard_pattern)
has_attribute(key)
get_attribute(key)
set_attribute(key, value)
__len__()
__iter__()
carla.ActorAttribute
id
type
recommended_values
is_modifiable
as_bool()
as_int()
as_float()
as_str()
as_color()
__eq__(other)
__ne__(other)
__nonzero__()
__bool__()
__int__()
__float__()
__str__()
carla.ActorList
find(id)
filter(wildcard_pattern)
__getitem__(pos)
__len__()
__iter__()
carla.Actor
id
type_id
parent
semantic_tags
is_alive
attributes
get_world()
get_location()
get_transform()
get_velocity()
get_acceleration()
set_location(location)
set_transform(transform)
set_simulate_physics(enabled=True)
destroy()
carla.Vehicle(carla.Actor)
bounding_box
apply_control(vehicle_control)
get_control()
set_autopilot(enabled=True)
get_physics_control()
apply_physics_control(vehicle_physics_control)
get_speed_limit()
get_traffic_light_state()
is_at_traffic_light()
get_traffic_light()
carla.TrafficLight(carla.Actor)
state
set_state(traffic_light_state)
get_state()
set_green_time(green_time)
get_green_time()
set_yellow_time(yellow_time)
get_yellow_time()
set_red_time(red_time)
get_red_time()
get_elapsed_time()
freeze(True)
is_frozen()
get_pole_index()
get_group_traffic_lights()
carla.Sensor(carla.Actor)
is_listening
listen(callback_function)
stop()
carla.SensorData
frame_number
transform
carla.Image(carla.SensorData)
width
height
fov
raw_data
convert(color_converter)
save_to_disk(path, color_converter=None)
__len__()
__iter__()
__getitem__(pos)
__setitem__(pos, color)
carla.LidarMeasurement(carla.SensorData)
horizontal_angle
channels
raw_data
get_point_count(channel)
save_to_disk(path)
__len__()
__iter__()
__getitem__(pos)
__setitem__(pos, location)
carla.CollisionEvent(carla.SensorData)
actor
other_actor
normal_impulse
carla.LaneInvasionEvent(carla.SensorData)
actor
crossed_lane_markings
carla.GnssEvent(carla.SensorData)
latitude
longitude
altitude
carla.ObstacleDetectionSensorEvent(carla.SensorData)
actor
other_actor
distance
carla.VehicleControl
throttle
steer
brake
hand_brake
reverse
__eq__(other)
__ne__(other)
carla.WheelPhysicsControl
tire_friction
damping_rate
steer_angle
disable_steering
__eq__(other)
__ne__(other)
carla.VehiclePhysicsControl
torque_curve
max_rpm
moi
damping_rate_full_throttle
damping_rate_zero_throttle_clutch_engaged
damping_rate_zero_throttle_clutch_disengaged
use_gear_autobox
gear_switch_time
clutch_strength
mass
drag_coefficient
center_of_mass
steering_curve
wheels
__eq__(other)
__ne__(other)
carla.Map
name
get_spawn_points()
get_waypoint(location, project_to_road=True)
get_topology()
generate_waypoints(distance)
to_opendrive()
save_to_disk(path=self.name)
carla.Waypoint
transform
is_intersection
lane_width
road_id
lane_id
lane_change
lane_type
next(distance)
get_right_lane()
get_left_lane()
carla.LaneChange
None
Right
Left
Both
carla.WeatherParameters
cloudyness
precipitation
precipitation_deposits
wind_intensity
sun_azimuth_angle
sun_altitude_angle
__eq__(other)
__ne__(other)
Static presets
carla.WeatherParameters.ClearNoon
carla.WeatherParameters.CloudyNoon
carla.WeatherParameters.WetNoon
carla.WeatherParameters.WetCloudyNoon
carla.WeatherParameters.MidRainyNoon
carla.WeatherParameters.HardRainNoon
carla.WeatherParameters.SoftRainNoon
carla.WeatherParameters.ClearSunset
carla.WeatherParameters.CloudySunset
carla.WeatherParameters.WetSunset
carla.WeatherParameters.WetCloudySunset
carla.WeatherParameters.MidRainSunset
carla.WeatherParameters.HardRainSunset
carla.WeatherParameters.SoftRainSunset
carla.Vector3D
x
y
z
__add__(other)
__sub__(other)
__eq__(other)
__ne__(other)
carla.Location
x
y
z
distance(other)
__add__(other)
__sub__(other)
__eq__(other)
__ne__(other)
carla.Rotation
pitch
yaw
roll
get_forward_vector()
__eq__(other)
__ne__(other)
carla.Transform
location
rotation
transform(geom_object)
get_forward_vector()
__eq__(other)
__ne__(other)
carla.BoundingBox
location
extent
__eq__(other)
__ne__(other)
carla.Timestamp
frame_count
elapsed_seconds
delta_seconds
platform_timestamp
__eq__(other)
__ne__(other)
carla.Color
r
g
b
a
__eq__(other)
__ne__(other)
carla.ColorConverter
Raw
Depth
LogarithmicDepth
CityScapesPalette
carla.ActorAttributeType
Bool
Int
Float
RGBColor
carla.TrafficLightState
Red
Yellow
Green
Off
Unknown
carla.LaneMarking
Other
Broken
Solid
module carla.command
carla.command.DestroyActor
actor_id
carla.command.ApplyVehicleControl
actor_id
control
carla.command.ApplyWalkerControl
actor_id
control
carla.command.ApplyTransform
actor_id
transform
carla.command.ApplyVelocity
actor_id
velocity
carla.command.ApplyAngularVelocity
actor_id
angular_velocity
carla.command.ApplyImpulse
actor_id
impulse
carla.command.SetSimulatePhysics
actor_id
enabled
carla.command.SetAutopilot
actor_id
enabled
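As a brief sketch of how these commands can be used together with Client.apply_batch (listed above), for instance to destroy a list of previously spawned actors in a single request (actor_list here is assumed to be a Python list of actors we spawned ourselves)
# Destroy several actors with one batched request instead of one call per actor.
client.apply_batch([carla.command.DestroyActor(actor.id) for actor in actor_list])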