This document describes how to get started with the PIX4Dengine Cloud REST API, which gives third-party applications access to the Pix4D cloud service. It allows a PIX4Dengine Cloud client to access their data on the cloud and perform a range of operations on it.
To register the application and start using the PIX4Dengine Cloud REST API, you will need to acquire a PIX4Dengine Cloud license. Please contact us at https://www.pix4d.com/enterprise-contact to request one.
Once the license has been issued, you need a client_id/client_secret pair that represents your application: these are the credentials your application will use to connect and authenticate to the API.
Note that, as its name suggests, client_secret is a password identifying the client application and should therefore be handled securely. To generate a client_id/client_secret pair, follow these steps:
Log in to https://cloud.pix4d.com with your Pix4D account.
Go to your Organization dashboard: open https://account.pix4d.com, choose your Organization, and go to the Dashboard.
In the API access section, the list of existing keys is displayed (it will be empty the first time a user logs in):
To disable existing credentials, delete them by clicking Delete.
Equipped with the API client_id/client_secret, the first step is to retrieve an authentication token. An authentication token identifies both the application connecting to the API and the Pix4D user this application is connecting on behalf of.
This token must then be passed along every single request that the application makes to the API. It is passed in the HTTP Authorization header, like so:
Authorization: Bearer <ACCESS_TOKEN>
The PIX4Dengine Cloud REST API uses OAuth 2.0, the industry standard for connecting apps and accounts. OAuth 2.0 supports several "authentication flows" to retrieve an authentication token. Pix4D supports several of them, each used for a specific purpose, but for PIX4Dengine Cloud customers only one is relevant: "Client Credentials".
Using this flow, an API client application can get access to its own Pix4D user account (and only that account). With this method, authentication is straightforward and only requires the application's client_id/client_secret pair.
The client must send the following HTTP POST request to https://cloud.pix4d.com/oauth2/token/ with a payload containing:
grant_type: client_credentials
client_id: the client ID of the application that was given to you by your Pix4D contact
client_secret: the client secret of the application that was given to you by your Pix4D contact
token_format: jwt
Example:
curl --request POST \
-d "grant_type=client_credentials&token_format=jwt&client_id=YuB7fu…&client_secret=GMSVvt8dF…" \
https://cloud.pix4d.com/oauth2/token/
The response you receive when performing the authentication request above has the following content (in JSON):
access_token: the token value that you will have to include in all your requests
token_type: for Pix4D, this is always a Bearer token
expires_in: the number of seconds the token is valid for. After this time, API requests using this token will be rejected and you will need to request a new token through the authentication procedure
scope: describes what the token is valid for. In this case, it is always "read write", since you get full access to your own account
{
"access_token": "<ACCESS_TOKEN>",
"token_type": "Bearer",
"expires_in": 36000,
"scope": "read write"
}
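The fields you need can be extracted from the response programmatically. A minimal sketch, using the sample payload above in place of a live response (in practice, RESPONSE would hold the output of the curl call to https://cloud.pix4d.com/oauth2/token/):

```shell
# RESPONSE stands in for the JSON returned by the token endpoint.
RESPONSE='{"access_token":"<ACCESS_TOKEN>","token_type":"Bearer","expires_in":36000,"scope":"read write"}'

# Extract the token with python3 (available on most systems) and export it
# so later curl calls can use it in the Authorization header.
export PIX4D_ACCESS_TOKEN=$(printf '%s' "$RESPONSE" \
  | python3 -c 'import sys, json; print(json.load(sys.stdin)["access_token"])')
echo "$PIX4D_ACCESS_TOKEN"
```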
When the token expires, you simply need to perform the above authentication procedure again to get a fresh token.
For an in-depth description of all possible API commands, please refer to the API documentation. This documentation is only available to users who already have API access.
In this guide, you will discover how to get a token, upload a new project, get it processed, and access the results.
You need a PIX4Dengine Cloud license to authenticate and get access to the API. Please contact your Pix4D reseller to start a trial if you don't have a license already. More information about PIX4Dengine Cloud is available on our product page.
With this license comes your authentication information, the Client ID and Client Secret Key, which are the two values you need in this guide.
This guide can be completed from the Terminal using basic tooling:
curl is used to call the API from the command line; it should be included in your OS or Docker image
aws-cli is used to upload and download the data; you can get it directly from AWS: https://aws.amazon.com/cli/
We will use a set of photos to be processed in this guide. If you don't have a good dataset at your disposal, feel free to download one of our sample datasets, for example the building dataset. Select "Download" then "Input Images" from the UI. We assume you are unzipping the archive to ./photos in this guide.
We are using the OAuth2 Client Credentials flow to generate an access token. Using the Client ID and Client Secret provided with your PIX4Dengine Cloud license, you can get an Access Token using the following curl command:
export PIX4D_CLIENT_ID=__YOUR_CLIENT_ID__
export PIX4D_CLIENT_SECRET=__YOUR_CLIENT_SECRET_KEY__
curl --request POST \
--url https://cloud.pix4d.com/oauth2/token/ \
--form client_id=$PIX4D_CLIENT_ID \
--form client_secret=$PIX4D_CLIENT_SECRET \
--form grant_type=client_credentials \
--form token_type=access_token \
--form token_format=jwt
The response body is a JSON document containing an access_token attribute:
{
"access_token": "<__PIX4D_ACCESS_TOKEN__>",
"expires_in": 172800,
"token_type": "Bearer",
"scope": "read:cloud write:cloud"
}
For the following example, we will reference the token as PIX4D_ACCESS_TOKEN.
export PIX4D_ACCESS_TOKEN=__PIX4D_ACCESS_TOKEN__
You can learn more about authentication in the reference documentation (see Authentication).
Let's start by creating a project. The only required parameter is the project name, which can be passed as a JSON payload in the request body.
curl --request POST \
--url https://cloud.pix4d.com/project/api/v3/projects/ \
--header "Authorization: Bearer ${PIX4D_ACCESS_TOKEN}" \
--header "Content-Type: application/json" \
--data '{"name": "My first project"}'
The response will be a 201 CREATED and the response body will include the project details. Please keep note of the project id and the AWS S3 properties that we will use later.
This is an extract of the relevant JSON properties:
{
"id": 877866,
"bucket_name": "prod-pix4d-cloud-default",
"s3_base_path": "user-123123123121312/project-877866"
}
export PROJECT_ID=<THE PROJECT ID>
export S3_BUCKET=prod-pix4d-cloud-default
export S3_BASE_PATH="user-123123123121312/project-877866"
It is recommended to use the AWS CLI or the Python boto3 library, but other tools can work as well. First, we need to retrieve the AWS S3 credentials associated with this project:
curl --url https://cloud.pix4d.com/project/api/v3/projects/$PROJECT_ID/s3_credentials/ \
--header "Authorization: Bearer ${PIX4D_ACCESS_TOKEN}" \
--header "Content-Type: application/json"
The answer contains all the S3 information we need, in particular the access_key, secret_key, and session_token.
{
"access_key": "ASIATOCJLBKSU2CVJIHR",
"secret_key": "5OGGBSvn8Sesdu8l...<remainder of secret key>",
"session_token": "FwoGZXIvYX...<remainder of security token>",
"expiration": "2021-05-10T21:55:47Z",
"bucket": "prod-pix4d-cloud-default",
"key": "user-199a56ab-7ac6-d6d1-4778-5b4d338fc9de/project-883349",
"server_time": "2021-05-19T09:55:47.357641+00:00",
"region": "us-east-1"
}
We can store the S3 credentials in our environment so that they will get picked up by the AWS CLI tool.
export AWS_ACCESS_KEY_ID=ASIATOCJLBKSU2CVJIHR
export AWS_SECRET_ACCESS_KEY='5OGGBSvn8Sesdu8l...<remainder of secret key>'
export AWS_SESSION_TOKEN='FwoGZXIvYX...<remainder of security token>'
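Rather than copying the values by hand, the credentials response can be parsed into environment variables. A sketch, using an abridged version of the sample response above in place of a live call to the /s3_credentials/ endpoint:

```shell
# CREDS stands in for the JSON returned by the /s3_credentials/ endpoint.
CREDS='{"access_key":"ASIATOCJLBKSU2CVJIHR","secret_key":"5OGGBSvn8Sesdu8l","session_token":"FwoGZXIvYX"}'

# Helper (illustrative): pull one field out of the JSON document.
s3field() { printf '%s' "$CREDS" | python3 -c "import sys, json; print(json.load(sys.stdin)['$1'])"; }

export AWS_ACCESS_KEY_ID=$(s3field access_key)
export AWS_SECRET_ACCESS_KEY=$(s3field secret_key)
export AWS_SESSION_TOKEN=$(s3field session_token)
echo "$AWS_ACCESS_KEY_ID"
```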
Make sure to prefix all your desired destination locations with the path returned in the credentials call. This is the only place for which write access is granted. Make sure that the files are proper images and their names include an extension supported by Pix4D.
Provided your images are located in a folder at $HOME/images, and that it contains only images, you can upload them all with a single command:
aws s3 cp $HOME/images/ "s3://${S3_BUCKET}/${S3_BASE_PATH}/" --recursive
You then need to register the images in the PIX4Dengine Cloud API so that they will be processed. Since it is possible to upload files to S3 that are not project inputs, the API cannot know which files are meant as input; each uploaded file must therefore be registered explicitly. Register the files you uploaded with a single API call:
curl --request POST --url https://cloud.pix4d.com/project/api/v3/projects/$PROJECT_ID/inputs/bulk_register/ \
--header "Authorization: Bearer ${PIX4D_ACCESS_TOKEN}" \
--header "Content-Type: application/json" \
--data "{
\"input_file_keys\": [
\"${S3_BASE_PATH}/P0350035.JPG\",
\"${S3_BASE_PATH}/P0360036.JPG\",
\"${S3_BASE_PATH}/P0370037.JPG\",
\"${S3_BASE_PATH}/P0380038.JPG\",
\"${S3_BASE_PATH}/P0390039.JPG\",
\"${S3_BASE_PATH}/P0400040.JPG\",
\"${S3_BASE_PATH}/P0410041.JPG\",
\"${S3_BASE_PATH}/P0420042.JPG\",
\"${S3_BASE_PATH}/P0430043.JPG\",
\"${S3_BASE_PATH}/P0440044.JPG\"
]
}"
The response should confirm, among other data, the number of images that have been registered.
{ "nb_images_registered": 10 }
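With many images, the input_file_keys array is tedious to write by hand. A sketch that builds the payload from a list of file names; the file names and the S3_BASE_PATH value here are placeholders taken from the examples above:

```shell
S3_BASE_PATH="user-123123123121312/project-877866"  # placeholder: value from project creation
FILES="P0350035.JPG P0360036.JPG"                   # in practice: the files you uploaded

# Build a JSON array of "<base path>/<file>" keys, then wrap it in the payload.
KEYS=$(for f in $FILES; do printf '"%s/%s",' "$S3_BASE_PATH" "$f"; done)
PAYLOAD="{\"input_file_keys\": [${KEYS%,}]}"
echo "$PAYLOAD"
```

The resulting PAYLOAD string can then be passed to curl with --data "$PAYLOAD".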
You are now ready to start processing your project:
curl --request POST --url https://cloud.pix4d.com/project/api/v3/projects/$PROJECT_ID/start_processing/ \
--header "Authorization: Bearer ${PIX4D_ACCESS_TOKEN}" \
--header "Content-Type: application/json"
You can check the processing status by querying the project details:
curl --url https://cloud.pix4d.com/project/api/v3/projects/$PROJECT_ID/ \
--header "Authorization: Bearer ${PIX4D_ACCESS_TOKEN}" \
--header "Content-Type: application/json"
A response can look like:
{
"id": 883349,
"name": "My first project",
"display_name": "My first project",
"project_group_id": null,
"is_demo": false,
"is_geolocalized": true,
"create_date": "2021-05-19T11:53:39.504849+02:00",
"public_status": "PROCESSING",
"display_detailed_status": "Waiting for processing",
"error_reason": "",
"user_display_name": "Jhon Doe",
"project_thumb": "<__URL__>",
"detail_url": "https://cloud.pix4d.com/project/api/v3/projects/883349/",
"acquisition_date": "2021-05-19T11:53:38.973075+02:00",
"project_type": "pro",
"image_count": 10,
"last_datetime_processing_started": "2021-05-19T10:58:10.514800Z",
"last_datetime_processing_ended": null,
"bucket_name": "prod-pix4d-cloud-default",
"s3_bucket_region": "us-east-1",
"s3_base_path": "user-188a56ab-7ac6-d6d1-4778-5b4d338fc9de/project-883349",
"never_delete": false,
"under_trial": false,
"uuid": "239ae97821d54f98975bc0afa2fcc72f",
"coordinate_system": "",
"outputs": {
"mesh": { "texture_res": {} },
"images": {
"project_thumb": {
"status": "processed",
"name": "project_thumb.jpg",
"s3_key": "user-188a56ab-7ac6-d6d1-4778-5b4d338fc9de/project-883349/thumb/project_thumb.jpg",
"s3_bucket": "prod-pix4d-cloud-default"
},
"reflectance": {}
},
"map": { "layers": {}, "bounds": { "sw": [0, 0], "ne": [0, 0] } },
"bundles": {
"inputs": {
"status": "requestable",
"request_url": "https://cloud.pix4d.com/project/api/v3/projects/883349/inputs/zip/"
}
},
"reports": {}
},
"min_zoom": -1,
"max_zoom": -1,
"proj_pipeline": ""
}
Once the status changes from PROCESSING to DONE, the project's main outputs are ready to be retrieved.
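The status check can be automated by extracting public_status from the project details and polling until it reaches DONE. A sketch of the extraction step, using an abridged sample response; in a real loop, DETAILS would come from the curl call above, with a pause between attempts:

```shell
# DETAILS stands in for the project-details JSON returned by the API.
DETAILS='{"id": 883349, "public_status": "PROCESSING"}'
STATUS=$(printf '%s' "$DETAILS" \
  | python3 -c 'import sys, json; print(json.load(sys.stdin)["public_status"])')
echo "$STATUS"
# A polling loop would repeat the curl + extraction until STATUS is DONE,
# e.g.: while [ "$STATUS" != "DONE" ]; do sleep 60; ...; done
```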
Once the project is processed, querying its details gives access to the generated outputs.
This guide describes all of the different ways to process a project with PIX4Dengine Cloud API. Once the project has been created, in order to process it, the following endpoint must be called:
POST on https://cloud.pix4d.com/project/api/v3/projects/{id}/start_processing/ See the full documentation for this endpoint.
The request body includes:
{
"tags": ["string"]
}
Depending on the type of processing, not all of the parameters in the request body are required. There are different types of processing:
Unless explicitly specified, processing types are mutually exclusive.
A faster processing pipeline for nadir datasets with the newest algorithms that yields better results and has better management of coordinate systems. It supports vertical coordinate systems over an ellipsoid, geoid model, or user-defined constant geoid undulation. This pipeline also produces a 3D mesh with improved visualization in the PIX4Dcloud viewer.
This is the default pipeline if nothing is specified.
It can produce the following outputs:
Similarly to the other pipelines, it can be selected by using nadir (or equivalently 3d-maps) in the tags parameter of the processing options payload:
{
"tags": [
"nadir"
]
}
Enable specific settings for a particularly flat scene, such as a field with few to no vertical structures (trees or buildings). To enable those settings, use the flat tag in addition to the nadir one in the tags parameter of the processing options payload:
{
"tags": [
"nadir",
"flat"
]
}
Note that this tag is only valid for the nadir processing pipeline (that is, together with the nadir or 3d-maps tags), not for the oblique or building processing pipelines.
A faster processing pipeline for oblique datasets with the newest algorithms that yields better results and has better management of coordinate systems (e.g. it supports vertical coordinate systems: ellipsoidal, geoid, or user-defined constant geoid undulation). This pipeline also produces a 3D mesh with improved visualization in the PIX4Dcloud viewer.
It can produce the following outputs:
Similarly to the other pipelines, it can be selected by using oblique in the tags parameter of the processing options payload:
{
"tags": [
"oblique"
]
}
The API provides a third processing pipeline for images (and, optionally, depth data) captured by PIX4Dcatch. This photogrammetric pipeline is optimized for this type of data and produces better and faster results. It can be used with images alone, or, if your mobile device is equipped with a LiDAR scanner, the pipeline will use both images and depth data in the process. The outputs generated after processing are the orthophoto, DSM, point cloud, and 3D mesh. This pipeline is automatically selected when using images captured with PIX4Dcatch.
The LiDAR scanner captures depth information during the image acquisition. These LiDAR points will compensate for the lack of 3D points over reflective and low-texture surfaces.
More information on combining photogrammetry and LiDAR can be found in this article: https://www.pix4d.com/blog/lidar-photogrammetry
For images captured from oblique flights around targets with featureless facades, such as walls of a uniform color and texture, the building reconstruction pipeline can provide higher-quality results than standard processing would otherwise.
This pipeline can be selected by passing building in the tags parameter:
{
"tags": [
"building"
]
}
In the event of a processing failure, an error code and reason are given in the project details API response. Error codes are intended to be machine-readable, while error reasons are human-readable messages to aid in debugging.
The following table describes the possible error codes and their corresponding reasons:
Code | Reason | Potential Mitigation |
---|---|---|
10001 | An unexpected error occurred. | |
10002 | Processing exceeded allotted time. | |
10003 | Processing exceeded available resources. | |
10101 | Failed to create cameras. More information will be available in the processing log. | Refer to the processing log for additional details. |
10201 | Failed to calibrate a sufficient number of cameras. | Verify the image quality and overlap. |
10402 | Point cloud generation failed. | Verify the image quality and overlap. |
10410 | Processing failed. | |
10411 | Failed to densify sparse point cloud. | Verify the quality of the calibration. |
10501 | 3D textured mesh generation failed. | Ensure that the dense point cloud consists of a single block. |
10601 | The input data is not valid. | |
10611 | The selected output CRS is not isometric. | Use a valid isometric CRS. |
10612 | The selected output CRS is not projected or arbitrary. | Use a valid projected or arbitrary CRS. |
10613 | An output CRS cannot be defined without a horizontal CRS component. | Use a valid CRS with a horizontal component. |
10614 | A geoid model or geoid height cannot be specified without a CRS vertical component. | Remove the geoid model or add a vertical component to the CRS. |
10615 | A geoid height cannot be specified with a geoid model. | Remove the geoid height or add a geoid model to the CRS. |
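If your client handles failures automatically, the table can be turned into a small lookup. A sketch mapping a few of the codes above to their mitigations; codes not listed fall through to checking the processing log:

```shell
# Map a processing error code to a coarse mitigation hint (subset of the table above).
mitigation() {
  case "$1" in
    10201|10402) echo "Verify the image quality and overlap." ;;
    10411)       echo "Verify the quality of the calibration." ;;
    10611)       echo "Use a valid isometric CRS." ;;
    10612)       echo "Use a valid projected or arbitrary CRS." ;;
    *)           echo "Refer to the processing log for additional details." ;;
  esac
}
mitigation 10402
```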
Here we will guide you through creating a "Site" to group Projects into a timeline view.
The primary use case for this is to group Projects of the same location, created over a time period, for example to track a building construction project.
In the Cloud Frontend you can compare these projects easily to see the differences between any of the processed assets in both 2D and 3D views.
POST on https://cloud.pix4d.com/project/api/v3/project_groups/ specifying a name and setting the project_group_type to bim.
curl --request POST --url https://cloud.pix4d.com/project/api/v3/project_groups/ \
--header "Authorization: Bearer ${PIX4D_ACCESS_TOKEN}" \
--header "Content-Type: application/json" \
--data '{"name": "Construction site 123", "project_group_type": "bim"}'
This will return the information about the Site (ProjectGroup) you have created, including its id.
{
"name": "Construction site 123",
"id": 112233,
...
}
To assign a Project to a Site, you can PUT the project into the project_group (a.k.a. Site) using the move_batch endpoint. You will need to supply the Organization's uuid in the owner_uuid field.
Note: using the PATCH method on the Project detail endpoint (project/api/v3/projects/<id>/) to move projects is deprecated and will be removed at some point in Q3 2025.
curl --request PUT \
--url https://cloud.pix4d.com/common/api/v4/drive/move_batch/ \
--header "Authorization: Bearer ${PIX4D_ACCESS_TOKEN}" \
--header 'content-type: application/json' \
--data '{
"owner_uuid": "58101312-de35-4a22-a951-943eba20a041",
"source_nodes": [
{
"type": "project",
"uuid": "0c26579b-f2ad-4dfb-a253-a4c43dbbbf53"
}
],
"target_type": "project_group",
"target_uuid": "50df7f65-f480-4fcc-9f86-32d8d7724689"
}'
To un-assign a Project from a Site, you can PUT the project into the organization (or even a folder) using the move_batch endpoint. You will need to supply the Organization's uuid in the owner_uuid field.
Note: using the PATCH method on the Project detail endpoint (project/api/v3/projects/<id>/) to move projects is deprecated and will be removed at some point in Q3 2025.
curl --request PUT \
--url https://cloud.pix4d.com/common/api/v4/drive/move_batch/ \
--header "Authorization: Bearer ${PIX4D_ACCESS_TOKEN}" \
--header 'content-type: application/json' \
--data '{
"owner_uuid": "58101312-de35-4a22-a951-943eba20a041",
"source_nodes": [
{
"type": "project",
"uuid": "0c26579b-f2ad-4dfb-a253-a4c43dbbbf53"
}
],
"target_type": "organization",
"target_uuid": "58101312-de35-4a22-a951-943eba20a041"
}'
The images which are uploaded for processing can have two different coordinate systems:
The image tags include latitude, longitude, and ellipsoidal height with respect to WGS84, which are read automatically by the software. In that case, the input coordinate system is set to WGS84.
The images do not have any geolocation; in this case, the input coordinate system is set to arbitrary.
Any other input coordinate system is not supported.
It is the coordinate system to which the outputs will refer.
When nothing is specified, the output coordinate system is set up by default:
If the input coordinate system is WGS84, the default output coordinate system will be WGS84 / UTM XX, where XX, the UTM zone, depends on the position of the images
If the input coordinate system is arbitrary, the output coordinate system will also be arbitrary
It is possible to define an output coordinate system when the project is created:
POST on https://cloud.pix4d.com/project/api/v3/projects/
One of the parameters of the request body is coordinate_system, passed as a string:
{
"coordinate_system": "string"
}
The vertical input coordinate system will be ellipsoidal.
The vertical output coordinate system will always be the EGM 96 geoid.
spatialreference.org hosts a database of EPSG-registered coordinate systems, which should cover most needs related to horizontal CS. For example, to select the Swiss coordinate system CH1903, search for it in the database and export it in OGC WKT format to get the WKT string expected by PIX4D:
PROJCS["CH1903 / LV03",
GEOGCS["CH1903",
DATUM["CH1903",
SPHEROID["Bessel 1841",6377397.155,299.1528128,
AUTHORITY["EPSG","7004"]],
AUTHORITY["EPSG","6149"]],
PRIMEM["Greenwich",0,
AUTHORITY["EPSG","8901"]],
UNIT["degree",0.0174532925199433,
AUTHORITY["EPSG","9122"]],
AUTHORITY["EPSG","4149"]],
PROJECTION["Hotine_Oblique_Mercator_Azimuth_Center"],
PARAMETER["latitude_of_center",46.9524055555556],
PARAMETER["longitude_of_center",7.43958333333333],
PARAMETER["azimuth",90],
PARAMETER["rectified_grid_angle",90],
PARAMETER["scale_factor",1],
PARAMETER["false_easting",600000],
PARAMETER["false_northing",200000],
UNIT["metre",1,
AUTHORITY["EPSG","9001"]],
AXIS["Easting",EAST],
AXIS["Northing",NORTH],
AUTHORITY["EPSG","21781"]]
Coordinate systems must comply with the WKT standards.
Learn how to use Ground Control Points (GCP) or Manual Tie Points (MTP) and Check Points in the computation.
A Ground Control Point (GCP) is a characteristic point whose coordinates are known. The coordinates have been measured with traditional surveying methods or obtained from other sources (LiDAR, older maps of the area, a Web Map Service). GCPs are used to georeference a project and reduce the noise.
A Manual Tie Point (MTP) is a characteristic point for which the coordinates are not known, but which is visible and accurately identifiable from several images, e.g. the corner of a wall. This is used to help the photogrammetry process to join the images in the scene.
The use of GCPs is possible with all PIX4D engines.
Once the project has been created and before processing it, it is possible to pass GCP coordinates which will be used in the computation:
POST on /project/api/v3/projects/{id}/gcp/register ({id} is the project ID)
Request body is as follows:
{
"gcps": [
{
"name": "GCP_123",
"point_type": "CHECKPOINT",
"x": 1.23,
"y": 45.2,
"z": 445.87,
"xy_accuracy": 0.02,
"z_accuracy": 0.02
},
{...}
]
}
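GCP coordinates usually come from a surveying export, so the payload is typically generated rather than written by hand. A sketch that builds it from a simple CSV; the CSV layout (name,x,y,z,xy_accuracy,z_accuracy) is an assumption for illustration, not a Pix4D format, and point_type is taken from the example above:

```shell
# Hypothetical surveying export; adapt the columns to your own data.
cat > gcps.csv <<'EOF'
GCP_123,1.23,45.2,445.87,0.02,0.02
GCP_124,1.24,45.3,446.10,0.02,0.02
EOF

# Convert each CSV row into an entry of the "gcps" array.
PAYLOAD=$(python3 - <<'EOF'
import csv, json
gcps = []
with open("gcps.csv") as f:
    for name, x, y, z, xy_acc, z_acc in csv.reader(f):
        gcps.append({"name": name, "point_type": "CHECKPOINT",  # value from the example above
                     "x": float(x), "y": float(y), "z": float(z),
                     "xy_accuracy": float(xy_acc), "z_accuracy": float(z_acc)})
print(json.dumps({"gcps": gcps}))
EOF
)
echo "$PAYLOAD"
```

The resulting JSON can be sent with curl --data "$PAYLOAD" to the /gcp/register endpoint.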
The coordinates of a GCP are x, y, and z, together with their accuracies (xy_accuracy and z_accuracy).
Although most coordinate systems are defined as above, there are some cases where the orientations of the axes are different, for example a coordinate system in Japan and another one in South Africa.
MTPs can be created similarly to GCPs, but omitting the georeferencing, and through a different endpoint:
POST on /project/api/v3/projects/{id}/mtp/ ({id} is the project ID)
Request body is as follows:
{
"mtps": [
{
"name": "MTP_456",
"is_checkpoint": false
},
{...}
]
}
For MTPs, the is_checkpoint field plays the role of the GCP point_type. This article explains the difference between the two.
Once the GCP or MTP data has been registered, it is necessary to also pass the marks of each GCP/MTP, in other words the pixel coordinates of the GCPs/MTPs in each of the images.
For GCP and MTP Marks, the gcp field is used for the name of the GCP or MTP.
POST on /project/api/v3/projects/{id}/mark/register/ ({id} is the project ID)
Request body is as follows:
{
"marks": [
{
"gcp": "GCP_123",
"photo": "user-123/project-354/my_file.jpg",
"x": 1.23,
"y": 45.2
},
{...}
]
}
photo: the s3_key of the image where the GCP/MTP has been marked (each GCP/MTP must be marked in at least two photos)
In order to mark the GCP/MTP in the images, it is possible to use PIX4Dmatic or other third-party applications.
This guide describes the specific case of using Pix4D AutoGCP to automatically mark GCPs and process a project with PIX4Dengine Cloud API.
This allows you to simply upload GCP coordinates and images, and the system will take care of marking GCP targets in the images. If you have manually generated the marks data then see this article on how to use that data directly.
Creating the project and uploading images is the same as in other examples.
For information on how to optimally set out your marks on your survey site see this article.
This is the same as in other examples.
You must also set the coordinate_system when creating the project. This should be the same coordinate system as your GCPs.
This is as described in the GCPs section.
Call the start processing endpoint as with all processing.
Processing with AutoGCP will take some extra time, due to the extra compute required to analyse and mark the GCPs on the input images.
This feature allows the user to define a region of interest, which means no reconstructions will be created outside the defined area when processing a project with PIX4Dengine Cloud API.
Once the project has been created, in order to define the region of interest, the following endpoint must be called:
POST on https://cloud.pix4d.com/project/api/v3/projects/{id}/processing_options/ See the full documentation for this endpoint.
The request body includes:
{
"area": {
"plane": {
"vertices3d": [
[
"float",
"float",
"float"
],
[
"float",
"float",
"float"
],
[
"float",
"float",
"float"
],
[
"float",
"float",
"float"
]
],
"outer_boundary": [
"int",
"int",
"int",
"int"
]
},
"thickness": "float"
}
}
The only required field to set a region of interest is the plane, which consists of:
vertices3d defines a list of 3D locations in WGS 84. For now, the altitude or z value of the location is not considered and the areas defined are applied only in the 2D plane.
outer_boundary defines the order in which each location stored in vertices3d must be considered when drawing the area.
There is also an optional field available inside plane, named inner_boundaries, to define areas inside the main area (defined with the outer_boundary) to be excluded from processing.
Finally, the thickness field is defined as a limit distance from the plane in the normal direction. If not specified, it is assumed to be infinite (the usual case when limiting the region of interest).
{
"area": {
"plane": {
"vertices3d": [
[
3.248295746230309,
43.415212850276255,
0
],
[
3.2484144251306066,
43.41525557252694,
0
],
[
3.248465887662594,
43.415162880462645,
0
],
[
3.2483382815883806,
43.415126642785765,
0
]
],
"outer_boundary": [
0,
1,
2,
3
]
}
}
}
{
"area": {
"plane": {
"vertices3d": [
[
3.248295746230309,
43.415212850276255,
0
],
[
3.2484144251306066,
43.41525557252694,
0
],
[
3.248465887662594,
43.415162880462645,
0
],
[
3.2483382815883806,
43.415126642785765,
0
],
[
3.248345633378664,
43.415194540731015,
0
],
[
3.248360336959232,
43.415163261911765,
0
],
[
3.248417050769994,
43.41518500450734,
0
]
],
"outer_boundary": [
0,
1,
2,
3
],
"inner_boundaries": [
[
4,
5,
6
]
]
}
}
}
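Before POSTing a region of interest, the boundary indices can be validated locally: every index in outer_boundary and inner_boundaries must reference an entry of vertices3d. A minimal sketch over an abridged payload:

```shell
# ROI is an abridged region-of-interest payload with illustrative coordinates.
ROI='{"area":{"plane":{"vertices3d":[[3.248,43.415,0],[3.249,43.415,0],[3.249,43.416,0],[3.248,43.416,0]],"outer_boundary":[0,1,2,3]}}}'
CHECK=$(printf '%s' "$ROI" | python3 -c '
import sys, json
plane = json.load(sys.stdin)["area"]["plane"]
n = len(plane["vertices3d"])
indices = list(plane["outer_boundary"])
for inner in plane.get("inner_boundaries", []):
    indices += inner
# Every boundary index must point at an existing vertex.
print("ok" if all(0 <= i < n for i in indices) else "out-of-range index")')
echo "$CHECK"
```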
A volume can be computed for a given project available on PIX4Dcloud. It is computed between a base surface boundary and the terrain surface.
The base surface boundary is given as a set of vertex coordinates and defines the base plane for the volume calculation.
Before running this computation, make sure that the project has been processed and is in the PROCESSED state.
To compute the volume, send a POST request to https://api.webgis.pix4d.com/v1/project/{id}/volumes/.
The payload body fields are described in the table below.
The payload must be in the same coordinate system and the same units as the project.
Parameter | Type |
---|---|
base_surface | String |
coordinates | Array[Array[Number]] |
custom_elevation | Number |
units | String |
Base surface
This parameter selects the base plane for the volume calculation; accepted values include triangulated and custom.
When using custom, the custom_elevation parameter is required (see below).
More information about the different base surfaces at Menu View > Volumes > Sidebar > Objects
Coordinates
Each set of coordinates refers to a vertex of the boundary. The coordinates must be given with respect to the output project coordinate system and in the same units as the project (meters, feet, or US survey feet).
custom_elevation
Optional in general, but MUST be provided when base_surface is set to custom. If custom_elevation is provided, only the X,Y vertex coordinates are needed; the Z is taken from custom_elevation.
units
Optional. If not provided, the preferred units will be used to calculate the volume. Accepted values are:
m (Metres)
yd (Yards)
ydUS (US Survey Yards)
(US Survey Yards)The response parameters will be in the same coordinate system and same units and the same units as the project. If the project is in meters, the volume will be computed in m³. If the project is in feet or US foot, the volume will be computed in yd³.
Parameter | Type |
---|---|
cut | Number |
cut_error | Number |
fill | Number |
fill_error | Number |
cut
The volume that is above the volume base. The volume is measured between the volume base and the surface defined by the DSM.
cut_error
Error estimation of the cut volume.
fill
The volume that is below the volume base. The volume is measured between the volume base and the surface defined by the DSM.
fill_error
Error estimation of the fill volume.
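A common follow-up is the net volume, i.e. cut minus fill. A sketch over a hypothetical response; the numbers are illustrative, not real output:

```shell
# RESP stands in for the JSON returned by the volumes endpoint.
RESP='{"cut": 120.5, "cut_error": 2.1, "fill": 30.2, "fill_error": 1.4}'
# Net volume = cut - fill, rounded to avoid floating-point noise.
NET=$(printf '%s' "$RESP" \
  | python3 -c 'import sys, json; d = json.load(sys.stdin); print(round(d["cut"] - d["fill"], 3))')
echo "$NET"
```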
If the request is successful, the body includes volumes and error estimations. The order of the response body fields doesn’t matter and is not guaranteed.
{
"cut": Number,
"cut_error": Number,
"fill": Number,
"fill_error": Number
}
{
"message": "Network error communicating with endpoint"
}
Cause | Likely a networking issue either in the API Gateway, or any server that failed to route the request properly to the next point |
Solution | Resend the request |
{
"message": "Endpoint request timed out"
}
Cause | The request took more than 29 seconds to compute. One of the reasons might be that too many vertices to describe the polygon boundary are sent |
Solution | Reduce the number of the polygon vertices |
{
"title": "404 Not Found"
}
Cause | The project doesn’t exist |
Solution | Check that the project exists |
{
"title": "No COG DSM found for this project"
}
Cause | The project doesn’t have a DSM. There may also be a delay between registering a DSM and the volume calculation being available, as PIX4Dcloud creates the COG DSM |
Solution | Check that the project has a DSM. Allow some time for the system to register the DSM and create the COG DSM |
{
"title": "The polygon doesn't overlap the dataset"
}
Cause | The vertices of the defined polygon lie outside the DSM |
{
"title": "Invalid Polygon"
}
Cause | The interpolation of elevations inside the polygon boundary failed |
{
"title": "Too much data requested"
}
Cause | The DSM doesn’t have a high enough overview level, as a result it is not possible to extract the maximum amount of data that is defined by the API |
Solution | A possible solution could be to reduce the size of the area for which the volume needs to be computed. If it is not a suitable solution or the issue persists contact the Support |
For a complete demonstration of a volume computation, see this Jupyter Notebook.
The following examples show the request body to POST to https://api.webgis.pix4d.com/v1/project/{id}/volumes/
{
"base_surface": "triangulated",
"coordinates": [
[328726.692, 4688271.030, 159.725],
[328728.351, 4688298.208, 150.376],
[328750.527, 4688289.860, 150.250],
[328744.430, 4688271.645, 150.056]
]
}
{
"base_surface": "triangulated",
"coordinates": [
[419430.327, 3469059.806],
[419429.338, 3469056.812],
[419431.319, 3469060.816],
[419433.329, 3469058.803]
],
"custom_elevation": 40.52
}
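The constraint that vertices are 2D when custom_elevation is given can be checked locally before sending the request. A sketch using the example payload above:

```shell
# REQ is the second example payload from above.
REQ='{"base_surface":"triangulated","coordinates":[[419430.327,3469059.806],[419429.338,3469056.812],[419431.319,3469060.816],[419433.329,3469058.803]],"custom_elevation":40.52}'
RESULT=$(printf '%s' "$REQ" | python3 -c '
import sys, json
r = json.load(sys.stdin)
if "custom_elevation" in r:
    # With custom_elevation, only X,Y per vertex are needed.
    ok = all(len(v) == 2 for v in r["coordinates"])
else:
    # Otherwise each vertex carries X,Y,Z.
    ok = all(len(v) == 3 for v in r["coordinates"])
print("valid" if ok else "invalid")')
echo "$RESULT"
```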
Note: the coordinates of the vertices are given with respect to the output project coordinate system and in the same units as the project (meters, feet, or US survey feet).
Use PIX4Dcatch to capture your scene.
Once the capture is complete, select "Export all data" to generate the ZIP file.
This is the same as in other examples.
Instead of uploading images, upload the ZIP file exported from PIX4Dcatch to S3.
aws s3 cp inputs.zip "s3://${S3_BUCKET}/${S3_BASE_PATH}/"
Then, register the ZIP file in place of the images.
curl --request POST \
--header "Authorization: Bearer ${PIX4D_ACCESS_TOKEN}" \
--header "Content-Type: application/json" \
--data \
"{
\"input_file_keys\": [
\"${S3_BASE_PATH}/inputs.zip\"
]
}" \
https://cloud.pix4d.com/project/api/v3/projects/$PROJECT_ID/inputs/bulk_register/
The ZIP file exported from PIX4Dcatch is compatible with PIX4Dcloud as-is. If, however, you need to modify the ZIP file's contents, there are certain things you should be aware of.
The ZIP file must contain a manifest.json at its top level, and the manifest.json itself must conform to the JSON schema. Optionally, you can also include the list of GCPs and Marks in the ZIP contents, following the JSON schema defined for them.
The input_control_points.json file must comply with the following rules:
- The id of each GCP must be unique.
- The definition property in the CRS should have one of the following forms:
  - Authority:code+code, where the first code is for a 2D CRS and the second one is for a vertical CRS (e.g. EPSG:4326+5773).
  - Authority:code+Authority:code, where the first code is for a 2D CRS and the second one is for a vertical CRS (e.g. EPSG:4326+EPSG:5773).
  - Authority:code, where the code is for a 2D or 3D CRS (e.g. EPSG:4326).
- A geoid_height can be passed. Please note that geoid is not supported in the current schema.
- Passing coordinate_systems will give a validation error.
If any of these requirements are not met, your project will not process and will be marked as errored.
An example ZIP file layout looks like:
├── input_control_points.json
├── logs
│ ├── log.json
│ └── test
│ └── abc.txt
├── manifest.json
├── new_sample
│ └── new
│ └── sample_image4.jpg
├── sample
│ └── sample_image3.jpg
├── sample_image1.jpg
└── sample_image2.jpg
The manifest.json from this ZIP file will contain:
{
"inputs": [
"sample_image1.jpg",
"sample_image2.jpg",
"sample/sample_image3.jpg",
"new_sample/new/sample_image4.jpg"
],
"log_files": [
"logs/log.json",
"logs/test/abc.txt"
],
"input_control_points": "input_control_points.json"
}
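A manifest like the one above can be generated from the ZIP-relative file paths. This is an illustrative sketch (the helper name is hypothetical); the resulting document must still conform to Pix4D's manifest JSON schema:

```python
import posixpath

def build_manifest(image_paths, log_paths=(), control_points=None):
    """Assemble a manifest.json document from ZIP-relative paths.

    Paths must use forward slashes, matching the layout shown above.
    """
    manifest = {"inputs": [posixpath.normpath(p) for p in image_paths]}
    if log_paths:
        manifest["log_files"] = [posixpath.normpath(p) for p in log_paths]
    if control_points:
        manifest["input_control_points"] = control_points
    return manifest
```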
The input_control_points.json from this ZIP file contains the list of all GCPs and Marks, as below.
{
"format": "application/opf-input-control-points+json",
"version": "0.2",
"gcps": [
{
"id": "gcp0",
"geolocation": {
"crs": {
"definition": "EPSG:4265+EPSG:5214",
"geoid_height": 14
},
"coordinates": [
1,
2,
3
],
"sigmas": [
5,
5,
10
]
},
"marks": [
{
"photo": "sample_image1.jpg",
"position_px": [
458,
668
],
"accuracy": 1.0
}
],
"is_checkpoint": true
}
],
"mtps": []
}
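Because a project that violates the control-point rules is marked as errored, it can be worth validating input_control_points.json before uploading. A minimal sketch, assuming the rules listed earlier (the helper and regex are illustrative, not an official validator):

```python
import re

# Accepted CRS "definition" forms, per the rules above:
#   Authority:code+code, Authority:code+Authority:code, or Authority:code
_CRS_DEF = re.compile(r"^[A-Za-z]+:\d+(\+([A-Za-z]+:)?\d+)?$")

def validate_control_points(doc):
    """Return a list of problems found in an input_control_points document."""
    errors = []
    gcps = doc.get("gcps", [])
    ids = [gcp["id"] for gcp in gcps]
    if len(ids) != len(set(ids)):
        errors.append("GCP ids must be unique")
    for gcp in gcps:
        crs = gcp.get("geolocation", {}).get("crs", {})
        if not _CRS_DEF.match(crs.get("definition", "")):
            errors.append(f"invalid CRS definition for {gcp.get('id')}")
        if "geoid" in crs:
            errors.append("'geoid' is not supported; use 'geoid_height'")
    return errors
```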
Once the project has been processed, a user can retrieve various data which are stored on the servers.
GET on https://cloud.pix4d.com/project/api/v3/projects/{id}/outputs/
The response body will include the s3_key and s3_bucket of all of the output types. The example below shows the point_cloud output:
{
"result_type": "point_cloud",
"output_type": "point_cloud",
"availability": "done",
"s3_key": "user-123123123123123123123123/project-741581/test_3dmaps/2_densification/point_cloud/test_3dmaps_group1_densified_point_cloud.las",
"s3_bucket": "prod-pix4d-cloud-default",
"s3_region": "us-east-1",
"output_id": 147817230
}
A list of the main outputs which can be obtained is shown below:
Result type | Output type | Description |
---|---|---|
ortho | ortho | Transparent orthomosaic in TIFF format |
ortho | ortho_rgba_bundle | ZIP file containing transparent orthomosaic in TIFF format, .prj and .tfw files |
ortho | ortho_rgb | Opaque orthomosaic in TIFF format |
ortho | ortho_rgb_bundle | ZIP file containing opaque orthomosaic in TIFF format, .prj and .tfw files |
ortho | ortho_cloud_optimized | Ortho in Cloud-Optimized GeoTIFF format |
dsm | dsm | Digital Surface Model (DSM) in TIFF format |
dsm_cloud_optimized | dsm_cloud_optimized | DSM in Cloud-Optimized GeoTIFF format |
point_cloud | point_cloud | Generated point cloud in LAS or LAZ format |
3d_mesh_obj | 3d_mesh_obj_zip | ZIP file containing .obj, .mtl, and .jpg files |
3d_mesh_obj | 3d_mesh_fbx | 3D mesh in FBX format |
3d_mesh_obj | b3dm_js | 3D mesh in Cesium format (index file) |
ndvi | ndvi | Generated NDVI layer in TIFF format |
quality_report | quality_report | PDF file with information about the process |
xml_quality_report | xml_quality_report | Quality Report in XML format |
mapper_log | mapper_log | Log file of the process in text format |
opf_project | opf_project | OPF project document |
Notes: Depending on the processing options and type of processing used, some outputs might not be generated.
GET on https://cloud.pix4d.com/project/api/v3/projects/{id}/photos/
The response body will include the s3_key and s3_bucket of all of the input images. The example below shows the image IMG_4082.JPG:
{
"id": 165965565,
"s3_key": "user-105e0ece-f221-467e-bab0-de5fbf004b61/project-741581/images/IMG_4082.JPG",
"thumbs_s3_key": {
"legacy_png_512": "user-105e0ece-f221-467e-bab0-de5fbf004b61/project-741581/photo_thumbnails/images/IMG_4082_thumb.png"
},
"s3_bucket": "prod-pix4d-cloud-default",
"width": 4000,
"height": 3000,
"excluded_from_mapper": null
}
It is recommended to use the AWS CLI or the Python boto3 library, but other tools can work as well. First, retrieve the AWS S3 credentials associated with this project:
GET on https://cloud.pix4d.com/project/api/v3/projects/{ID}/s3_credentials/
The response contains all the S3 information needed, in particular the access_key, secret_key and session_token.
{
"access_key": "ASIATOCJLBKSU2CVJIHR",
"secret_key": "5OGGBSvn8Sesdu8l...<remainder of the secret key>",
"session_token": "FwoGZXIvYX...<remainder of security token>",
"expiration": "2021-05-10T21:55:47Z",
"bucket": "prod-pix4d-cloud-default",
"key": "user-199a56ab-7ac6-d6d1-4778-5b4d338fc9de/project-883349",
"server_time": "2021-05-19T09:55:47.357641+00:00",
"region": "us-east-1"
}
The S3 credentials can be stored in your environment so that they are picked up by the AWS CLI tool.
export AWS_ACCESS_KEY_ID=ASIATOCJLBKSU2CVJIHR
export AWS_SECRET_ACCESS_KEY='5OGGBSvn8Sesdu8l...<remainder of the secret key>'
export AWS_SESSION_TOKEN='FwoGZXIvYX...<remainder of security token>'
export AWS_REGION='us-east-1'
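When scripting this step, the credentials response can be turned into the export lines above. A minimal sketch (the helper name is hypothetical; the field names match the s3_credentials response shown earlier):

```python
def aws_env_exports(creds):
    """Turn the s3_credentials response into shell export lines
    that the AWS CLI will pick up."""
    return [
        f"export AWS_ACCESS_KEY_ID={creds['access_key']}",
        f"export AWS_SECRET_ACCESS_KEY='{creds['secret_key']}'",
        f"export AWS_SESSION_TOKEN='{creds['session_token']}'",
        f"export AWS_REGION='{creds['region']}'",
    ]
```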
Once the credentials have been set, the copy command can be run to get a specific file using its unique "s3_key" and "s3_bucket":
aws s3 cp s3://${S3_BUCKET}/${S3_KEY} ./
As an example, to get the point cloud from the example above, install the AWS CLI and run the following:
aws s3 cp s3://prod-pix4d-cloud-default/user-123123123123123123123123/project-741581/test_3dmaps/2_densification/point_cloud/test_3dmaps_group1_densified_point_cloud.las ./
This copies the test_3dmaps_group1_densified_point_cloud.las file from AWS S3 to your working directory.
This is the Open Photogrammetry Format (OPF) project produced by processing on PIX4Dcloud. It can be read by PIX4Dmatic, and also with tooling such as pyopf.
The output type opf_project points to the top-level project.opf file. This file references all the other files in the project with relative paths, allowing you to download them as needed, using the same S3 credentials for a Project as in the examples above. Once you have downloaded project.opf, you can use pyopf to discover the files it references:
from pyopf import io
from pyopf import project as opf
from pyopf import resolve

index_file = "/some/directory/structure/project.opf"

# Load the top-level OPF index document
pix4d_project: opf.Project = io.load(index_file)

# Relative URIs of every resource referenced by the project
references = [resource.uri for item in pix4d_project.items for resource in item.resources]

# Resolve the project to reach nested objects such as camera lists
objects = resolve.resolve(pix4d_project)
images = [camera.uri for camera_list in objects.camera_list_objs for camera in camera_list.cameras]

print("Prepend the following with your Project S3 prefix and download them:")
print(references)
print(images)
Notes:
You can then open the project.opf in PIX4Dmatic or perform further analysis on the OPF documents.
GET on https://cloud.pix4d.com/project/api/v3/projects/{id}/inputs/zip/
An email will be sent containing a URL to download a ZIP file with all of the input images.
GET on https://cloud.pix4d.com/project/api/v3/projects/{id}/outputs/zip/
An email will be sent containing a URL to download a ZIP file with all of the outputs.
This feature allows embedding the editor (both the 2D and 3D views) in an iframe. Please contact your sales representative to enable this feature. You must specify the domain(s) where the PIX4Dcloud editor iframe will be embedded.
The feature works only with shared sites and datasets (information about share links can be found in this support article). This means that you first need to generate a token for your project (dataset) or project group (site).
Example request:
curl --request POST \
--header 'Content-Type: application/json' \
--header 'Accept: application/json' \
--header 'Authorization: **SECRET**' \
--data '{
"enabled": true,
"write": false,
"type": "Project",
"type_id": 1234
}' \
https://cloud.pix4d.com/common/api/v3/permission-token/
Example response:
{
"token": "d8dfbe6d-93b1-4bca-af40-5c469e3530da",
"enabled": true,
"write": false,
"type": "Project",
"type_id": 1234,
"creation_date": "2021-06-25T10:17:06.734910+02:00"
}
Example request:
curl --request POST \
--header 'Content-Type: application/json' \
--header 'Accept: application/json' \
--header 'Authorization: **SECRET**' \
--data '{
"enabled": true,
"type": "Project",
"type_id": 1234
}' \
https://cloud.pix4d.com/common/api/v3/permission-token/
Example response:
{
"token": "05a114e1-804f-4e6c-a094-5d3eb80d2119",
"enabled": true,
"write": true,
"type": "Project",
"type_id": 1234,
"creation_date": "2021-06-25T10:17:06.734910+02:00"
}
The generated token is a UUID v4 (xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx).
Note: Revoking (disabling) the share link will also disable your embedded editor. To do that, set 'enabled' to 'false':
curl --request PATCH \
--header 'Content-Type: application/json' \
--header 'Accept: application/json' \
--header 'Authorization: **SECRET**' \
--data '{
"enabled": false,
"type": "Project",
"type_id": 1234
}' \
https://cloud.pix4d.com/common/api/v3/permission-token/
Embedding is done through an HTML iframe tag. As an example:
<iframe
src="https://embed.pix4d.com/cloud/default/dataset/256164/map?shareToken=b77fa15a-14ff-4c6e-b7eb-da05bad16bb2&lang=fr&theme=light"
referrerpolicy="strict-origin-when-cross-origin"
allow="geolocation"
frameborder="0"
width="100%"
height="100%"
allowfullscreen
></iframe>
- src: an embed URL to your site/dataset. The “URL templates” section explains how to build different URLs for your users.
- referrerpolicy="strict-origin-when-cross-origin": the minimal permission that lets us determine whether the white label may be displayed on the domain.
- allow="geolocation": lets users use the geolocation feature of the editor.
- frameborder, width, height: recommended settings to make the editor appear seamless.
For a dataset:
https://embed.pix4d.com/cloud/default/dataset/:dataset_id/:view?shareToken=:share_token
- dataset_id: the id of the dataset
- view: optional parameter that can be either map or model
- share_token: the share token generated in the “Manage share link” step
For a site:
https://embed.pix4d.com/cloud/default/site/:site_id/dataset/:dataset_id/:view?shareToken=:share_token
- site_id: the id of the site
- dataset_id: the id of the dataset
- view: optional parameter that can be either map or model; map forces the map view, model forces the model view
- share_token: the share token generated in the “Manage share link” step
By defining additional query parameters, you can also set the language and theme:
- theme: can be either light or dark. By default it’s dark.
- lang: can be en-US, ja, ko, it or es. By default it’s en-US.
For example, to get the embedded editor in French with the theme set to light:
https://embed.pix4d.com/cloud/default/dataset/256164/map?shareToken=b77fa15a-14ff-4c6e-b7eb-da05bad16bb2&lang=fr&theme=light
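Embed URLs like the one above can be assembled from the URL templates. The sketch below is illustrative (the helper name and parameter handling are assumptions, based only on the templates and query parameters documented above):

```python
from urllib.parse import urlencode

EMBED_BASE = "https://embed.pix4d.com/cloud/default"

def embed_url(dataset_id, share_token, view=None, site_id=None, lang=None, theme=None):
    """Build a PIX4Dcloud embed URL for a dataset or a site dataset."""
    path = f"{EMBED_BASE}/dataset/{dataset_id}"
    if site_id is not None:
        path = f"{EMBED_BASE}/site/{site_id}/dataset/{dataset_id}"
    if view:  # optional: "map" or "model"
        path += f"/{view}"
    query = {"shareToken": share_token}
    if lang:
        query["lang"] = lang
    if theme:
        query["theme"] = theme
    return path + "?" + urlencode(query)
```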
To test your local website before publishing to your domain, you can run a server on any port, known locally to your machine as localhost.
For example, if you have the iframe HTML code from the example above in an index.html file, then you could run the following Python command to bring up a minimal webserver:
python -m http.server 8000
Then navigate to localhost:8000/index.html in your local web browser to test the embed view.
Opening the index.html file without going through a webserver on localhost will fail with a 403: You are not authorized to access this website, because the Referer header does not specify an allowed domain.
Only localhost:* is available to new clients for testing.
If you wish to deploy it to your own domain, you must reach out to your PIX4D sales representative and supply the domain names you plan to host the embed on (for example, domains for staging and production systems, so you can test before deploying); otherwise you will receive a 403 error.
The Annotation API allows programmatic management of annotations. It lets the API user perform the following operations:
curl --location --request POST 'https://api.webgis.pix4d.com/v1/annotations/' \
--header 'Authorization: Bearer <insert JWT here>' \
--header 'Content-Type: application/json' \
--data-raw '{
"annotations": [
{
"entity_type": "Project",
"entity_id": 123456,
"properties": {
"name": "My annotation",
"color": "#FFFFFF80",
"description": "My first annotation"
},
"geometry": {
"coordinates": [
2.38,
57.322,
0
],
"type": "Point"
}
},
{
"entity_type": "Project",
"entity_id": 123456,
"properties": {
"name": "My annotation",
"color": "#FFFFFF80",
"description": "My second annotation"
},
"geometry": {
"coordinates": [
3.49,
68.433,
0
],
"type": "Point"
}
}
]
}'
A successful response will have a status of 201 CREATED and return the annotation ids in a body similar to what is listed below.
{
"annotations": [
{
"annotation_id": "Project_123456_62063898-531b-4389-93f9-ed5126338ff3",
"success": true
},
{
"annotation_id": "Project_697180_82f70b94-773c-47db-80dd-5576e569548f",
"success": true
}
]
}
The example above creates two points in the project coordinate system; to visualise the annotations, go to PIX4Dcloud.
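Because bulk creation reports a per-annotation success flag, it is worth checking the response before assuming everything was created. A minimal sketch (the helper name is hypothetical; the response shape matches the example above):

```python
def failed_annotations(response):
    """Return the ids of annotations that were not created,
    from a bulk create/delete response."""
    return [a["annotation_id"] for a in response["annotations"] if not a["success"]]
```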
curl --location --request GET 'https://api.webgis.pix4d.com/v1/annotations/?entity_type=Project&entity_id=697180' \
--header 'Authorization: Bearer <INSERT_JWT_HERE>'
A successful response will look something like this:
{
  "results": [
    {
      "version": "1.0",
      "entity_id": 697180,
      "entity_type": "Project",
      "id": "Project_697180_31c1e01d-39a9-4926-9054-87934cee3c69",
      "created": "2022-05-17T08:16:49.669183+00:00",
      "modified": "2022-05-17T08:16:49.669186+00:00",
      "tags": [ ... ],
      "geometry": { ... },
      "properties": {
        "visible": true,
        "camera_position": [ ... ],
        "description": "Description",
        "volume": { ... },
        "color_fill": "#00224488",
        "name": "Annotation 0",
        "color": "#00224488"
      },
      "extension": { ... }
    },
    {
      "version": "1.0",
      "entity_id": 697180,
      "entity_type": "Project",
      "id": "Project_697180_4c8ac186-5427-46b5-8347-c1ee374fd10f",
      "created": "2022-05-17T08:16:49.670391+00:00",
      "modified": "2022-05-17T08:16:49.670393+00:00",
      "tags": [ ... ],
      "geometry": { ... },
      "properties": {
        "visible": true,
        "camera_position": [ ... ],
        "description": "Description",
        "volume": { ... }
      },
      "extension": { ... }
    }
  ]
}
curl --location --request DELETE 'https://api.webgis.pix4d.com/v1/annotations/' \
--header 'Authorization: Bearer <INSERT JWT HERE>' \
--header 'Content-Type: application/json' \
--data-raw \
'{
  "annotations": [
    "Project_123456_12345678-1234-1234-1234-123456789abc",
    "Project_654321_fedcba98-fedc-fedc-fedc-fedcba987654"
  ]
}'
A successful response will look something like this:
{
  "annotations": [
    {
      "success": true,
      "annotation_id": "Project_123456_12345678-1234-1234-1234-123456789abc"
    },
    {
      "success": true,
      "annotation_id": "Project_654321_fedcba98-fedc-fedc-fedc-fedcba987654"
    }
  ]
}
Here we will guide you through using Folders to organize your Datasets (Projects) and Sites (Project Groups), moving resources within the organization, and navigating the resulting "resource tree" by listing or searching content.
Folders can be used not only to organize content but also to manage permissions, as access to a Folder (as with Projects and Project Groups) can be limited to certain users.
POST to https://cloud.pix4d.com/common/api/v4/folders/, specifying a name and a parent using parent_type and parent_uuid. parent_type must be one of organization or folder.
curl --request POST \
--url https://cloud.pix4d.com/common/api/v4/folders/ \
--header "Authorization: Bearer ${PIX4D_ACCESS_TOKEN}" \
--header "Content-Type: application/json" \
--data '{
"name": "Customer ABC",
"parent_uuid": "f6dfd584-c4a2-43ae-9e61-672b7d1f5058",
"parent_type": "organization"
}'
This will return information about the Folder you have created, including its uuid.
{
"uuid": "65f7d6ab-2e46-4b30-8a3a-38fdaf54307a",
"name": "Customer ABC"
}
To rename a Folder, you can PATCH it to update the name field.
curl --request PATCH \
--url https://cloud.pix4d.com/common/api/v4/folders/65f7d6ab-2e46-4b30-8a3a-38fdaf54307a/ \
--header "Authorization: Bearer ${PIX4D_ACCESS_TOKEN}" \
--header 'content-type: application/json' \
--data '{
"name": "Customer ABC Ltd"
}'
The return value confirms the updated name.
{
"name": "Customer ABC Ltd"
}
To remove a Folder, use DELETE. The Folder will be deleted along with all its descendants in the resource tree.
curl --request DELETE \
--url https://cloud.pix4d.com/common/api/v4/folders/65f7d6ab-2e46-4b30-8a3a-38fdaf54307a/ \
--header "Authorization: Bearer ${PIX4D_ACCESS_TOKEN}"
To create a Project (or Project Group) inside a Folder, simply add a parent_type and parent_uuid to the body of the request.
curl --request POST \
--url https://cloud.pix4d.com/project/api/v3/projects/ \
--header "Authorization: Bearer ${PIX4D_ACCESS_TOKEN}" \
--header 'content-type: application/json' \
--data '{
"name": "Project 6",
"parent_uuid": "65f7d6ab-2e46-4b30-8a3a-38fdaf54307a",
"parent_type": "folder"
}'
Once you have created Folders within your organization you may want to move existing Projects and Project Groups (or other Folders) into them. To do this you can PUT to https://cloud.pix4d.com/common/api/v4/drive/move_batch/
curl --request PUT \
--url https://cloud.pix4d.com/common/api/v4/drive/move_batch/ \
--header "Authorization: Bearer ${PIX4D_ACCESS_TOKEN}" \
--header 'content-type: application/json' \
--data '{
"owner_uuid": "58101312-de35-4a22-a951-943eba20a041",
"source_nodes": [
{
"type": "project_group",
"uuid": "d9c5bec2-e11f-4371-8640-3d1534e8e3b2"
},
{
"type": "folder",
"uuid": "bee32b9e-360c-404b-81d3-7a44ff88eb66"
},
{
"type": "project",
"uuid": "1ec0ba95-b16a-4ee5-9353-511ae7e46778"
}
],
"target_type": "folder",
"target_uuid": "65f7d6ab-2e46-4b30-8a3a-38fdaf54307a"
}'
Now that you have Projects and Project Groups arranged within Folders, it is useful to list these resources according to their location. The list endpoint of the Drive allows you to list resources within a given Folder, or those at the "root" of the organization.
curl --request GET \
--url 'https://cloud.pix4d.com/common/api/v4/drive/folder/e8f6731b-dc00-4e27-afb9-2740fb76c843/' \
--header "Authorization: Bearer ${PIX4D_ACCESS_TOKEN}"
The paginated response will look like this:
{
"count": 4,
"next": null,
"previous": null,
"results": [
{
"legacy_id": 97389,
"uuid": "72cd5e09-38a6-4490-b8e7-97486f01e45d",
"type": "folder",
"date": "2025-02-21T13:45:24.978372+01:00",
"name": "Vancouver",
"metadata": null
},
{
"legacy_id": 97388,
"uuid": "a2c58c0f-861f-4240-b646-f35290fc81eb",
"type": "folder",
"date": "2025-02-21T13:45:00.906934+01:00",
"name": "Toronto",
"metadata": null
},
{
"legacy_id": 440328,
"uuid": "50df7f65-f480-4fcc-9f86-32d8d7724689",
"type": "project_group",
"date": "2025-03-06T17:25:46.890181+01:00",
"name": "Credit counter",
"metadata": null
},
{
"legacy_id": 1005051,
"uuid": "0c26579b-f2ad-4dfb-a253-a4c43dbbbf53",
"type": "project",
"date": "2025-03-06T17:23:43.384397+01:00",
"name": "CreditCounter",
"metadata": null
}
]
}
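Like other list endpoints here, the Drive response is paginated via a `next` URL. The following sketch follows those links until exhausted; `fetch` is any callable mapping a URL to the decoded JSON body (e.g. a thin wrapper around your HTTP client), so the helper itself is transport-agnostic and illustrative only:

```python
def iter_pages(fetch, url):
    """Yield every result from a paginated Drive listing,
    following the 'next' link until it is null."""
    while url:
        page = fetch(url)
        yield from page["results"]
        url = page.get("next")
```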
You can also search for resources within an Organization whose name matches a string.
curl --request GET \
--url 'https://cloud.pix4d.com/common/api/v4/drive/organization/58101312-de35-4a22-a951-943eba20a041/search/?q=inside' \
--header "Authorization: Bearer ${PIX4D_ACCESS_TOKEN}"
The paginated response is the same format as for the list endpoint:
{
"count": 3,
"next": null,
"previous": null,
"results": [
{
"legacy_id": 97389,
"uuid": "72cd5e09-38a6-4490-b8e7-97486f01e45d",
"type": "folder",
"date": "2025-02-21T13:45:24.978372+01:00",
"name": "Vancouver",
"metadata": null
},
{
"legacy_id": 438767,
"uuid": "baf79c5a-c402-45c9-9bf8-411eed93507c",
"type": "project_group",
"date": "2025-02-21T13:46:02.684781+01:00",
"name": "Vancouver 2",
"metadata": null
},
{
"legacy_id": 438766,
"uuid": "65a6914f-8db8-4ba3-ac31-4d5e388beee3",
"type": "project_group",
"date": "2025-02-21T13:45:52.247435+01:00",
"name": "Vancouver 1",
"metadata": null
}
]
}
Given a particular resource (identified by its type and uuid), you can also retrieve its "path" in the resource tree, which can be used to build breadcrumbs.
curl --request GET \
--url https://cloud.pix4d.com/common/api/v4/drive/project_group/baf79c5a-c402-45c9-9bf8-411eed93507c/path/ \
--header "Authorization: Bearer ${PIX4D_ACCESS_TOKEN}"
A successful response will be in the following format:
{
"path": [
{
"uuid": "baf79c5a-c402-45c9-9bf8-411eed93507c",
"type": "project_group",
"name": "Vancouver 2"
},
{
"uuid": "72cd5e09-38a6-4490-b8e7-97486f01e45d",
"type": "folder",
"name": "Vancouver"
},
{
"uuid": "0cda2c1a-e52f-4c54-aec6-e827d09c232a",
"type": "folder",
"name": "Canada"
}
],
"owner": {
"type": "organization",
"uuid": "58101312-de35-4a22-a951-943eba20a041",
"name": "Customer ABC Ltd"
},
"more_ancestors": false
}
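Since the path endpoint lists the resource first and its nearest ancestor next, building a breadcrumb means reversing the list and prefixing the owner. A minimal sketch (the helper name is hypothetical; the response shape matches the example above):

```python
def breadcrumb(path_response, sep=" / "):
    """Render a breadcrumb string from the Drive path endpoint:
    owner first, then ancestors down to the resource itself."""
    names = [path_response["owner"]["name"]]
    names += [node["name"] for node in reversed(path_response["path"])]
    if path_response.get("more_ancestors"):
        # The API truncated the ancestor list; mark the gap
        names.insert(1, "...")
    return sep.join(names)
```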
List projects the user can access.
A public_status field is provided to clients. It can take the values CREATED, UPLOADED, PROCESSING, DONE, ERROR.
A more detailed status can be found in the display_detailed_status field, but it is only for display, not for logic control, as it reflects an internal status whose name and flow might change.
It is possible to filter the projects returned by their public status using query parameters:
- public_status for a status to include
- public_status_exclude for a status to exclude
Multiple values can be used by providing the parameter multiple times with different values.
The fields id, name and display_name can be used in a similar filter/exclude fashion.
For example, if you don't want the demo project, you can set ?is_demo=false
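The filters above can be combined into one query string; repeating a parameter supplies multiple statuses. A minimal sketch (the helper name is hypothetical, based only on the filters described above):

```python
from urllib.parse import urlencode

def project_list_query(include=(), exclude=(), **extra):
    """Build the query string for the project list endpoint.

    Repeated (key, value) pairs express multi-valued filters.
    """
    params = [("public_status", s) for s in include]
    params += [("public_status_exclude", s) for s in exclude]
    params += list(extra.items())
    return urlencode(params)
```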
There are two ways to serialize a project: simple and detailed. By default, the list REST action uses the simple serializer while retrieve uses the detailed one. You can override the serializer used by passing a serializer query parameter with value simple or detailed.
page | integer A page number within the paginated result set. |
page_size | integer Number of results to return per page. |
{
  "count": 123,
  "results": [
    {
      "embed_urls": {
        "property1": "string",
        "property2": "string"
      },
      "error_reason": "string",
      "error_code": 0,
      "last_datetime_processing_started": "2019-08-24T14:15:22Z",
      "last_datetime_processing_ended": "2019-08-24T14:15:22Z",
      "s3_bucket_region": "string",
      "never_delete": true,
      "under_trial": true,
      "source": "string",
      "owner_uuid": "string",
      "credits": 0,
      "crs": {
        "definition": "string",
        "geoid_height": 0,
        "extensions": null,
        "identifier": "string",
        "name": "string",
        "type": "string",
        "units": {
          "property1": null,
          "property2": null
        },
        "axes": {
          "property1": null,
          "property2": null
        },
        "epsg": 0,
        "esri": 0,
        "wkt1": "string",
        "wkt2": "string",
        "proj_pipeline_to_wgs84": "string"
      },
      "public_share_token": "c827b6b5-2f34-47ee-824e-a48b2ab6b708",
      "public_status": "string",
      "detail_url": "string",
      "image_count": 0,
      "public_url": "string",
      "acquisition_date": "2019-08-24T14:15:22Z",
      "is_geolocalized": true,
      "s3_base_path": "string",
      "display_name": "string",
      "id": 0,
      "uuid": "095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
      "user_display_name": "string",
      "project_type": "pro",
      "project_group_id": 0,
      "create_date": "2019-08-24T14:15:22Z",
      "name": "string",
      "bucket_name": "string",
      "project_thumb": "string",
      "front_end_public_group_url": "string",
      "front_end_public_url": "string",
      "is_demo": true,
      "display_detailed_status": "string",
      "coordinate_system": "string"
    }
  ]
}
Create an empty project.
name is a mandatory parameter and is limited to 100 characters (it cannot contain slashes, must not start with a dash, and cannot end with whitespace).
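The name constraints can be checked client-side before calling the API. A minimal sketch of those documented rules (the helper name is hypothetical, and the server remains the authority):

```python
def validate_project_name(name):
    """Return an error message for an invalid project name, or None if valid.

    Rules per the documentation: 1-100 characters, no slashes,
    no leading dash, no trailing whitespace.
    """
    if not 1 <= len(name) <= 100:
        return "name must be 1-100 characters"
    if "/" in name or "\\" in name:
        return "name cannot contain slashes"
    if name.startswith("-"):
        return "name must not start with a dash"
    if name != name.rstrip():
        return "name cannot end with whitespace"
    return None
```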
project_type and acquisition_date are optional.
project_type can take the following values: pro, bim, model, ag. If project_type is not provided, it defaults to the preferred solution of the user as defined in their profile.
billing_model is one of CLOUD_STANDARD, CRANE, ENGINE_CLOUD or INSPECT:
- CRANE is accepted only if the project is created through a crane interface on an account that has a CRANE license.
- ENGINE_CLOUD is accepted only if the project is created through a public API interface on an account that has an ENGINE_CLOUD license.
- INSPECT is accepted only if the project is created through an Inspect interface on an account that has an INSPECT license.
- CLOUD_STANDARD is accepted only if the user has a valid license with cloud allowance.
A 400 error is returned if the billing model is unknown or invalid for the user.
If acquisition_date is not provided, it defaults to the current time. When provided, it must be in ISO 8601 format.
The coordinate system is compliant with the Open Photogrammetry Format (OPF) specification.
If passed, coordinate_system is either:
- an authority code (e.g. EPSG:21781)
In addition, the following values are accepted for arbitrary coordinate systems:
- ARBITRARY_METERS for the software to use an arbitrary default coordinate system in meters
- ARBITRARY_FEET for the software to use an arbitrary default coordinate system in feet
- ARBITRARY_US_FEET for the software to use an arbitrary default coordinate system in US survey feet
If coordinate_system is passed, two optional fields can be added:
- coordinate_system_geoid_height as a Float, to define the constant geoid height over the underlying ellipsoid in the units of the vertical CRS axis.
- coordinate_system_extensions as a JSON object with extension-specific objects (for more information: https://pix4d.github.io/opf-spec/specification/control_points.html#crs).
A 400 error code is returned if the coordinate_system is invalid.
Unsupported cases:
In addition please note that to process with PIX4Dmapper:
Organization Management users should specify the 'parent' of the project by passing:
- parent_type: one of organization, projectgroup or folder.
- parent_uuid: the uuid of the parent.
processing_email_notification is an optional parameter defaulting to true; setting it to false will disable all email notifications related to project processing events (e.g. start of processing, end of processing).
name required | string [ 1 .. 100 ] characters |
project_type | string (SolutionEnum) Enum: "pro" "bim" "ag" "model" "inspection" |
acquisition_date | string <date-time> |
project_group_id | integer or null |
billing_model | string (BillingModelEnum) Enum: "CLOUD_STANDARD" "ENGINE_CLOUD" "CRANE" "INSPECT" |
parent_id | string |
owner_uuid | string or null <uuid> |
owner_type | (OwnerTypeCe4Enum (string or null)) or (BlankEnum (any or null)) or (NullEnum (any or null)) |
processing_email_notification | boolean Default: true |
parent_uuid | string <uuid> |
parent_type | string (ProjectCreatorParentTypeEnum) Enum: "organization" "user" "project_group" "folder" |
coordinate_system | string or null |
coordinate_system_geoid_height | number or null <double> |
coordinate_system_extensions | any or null |
{
  "name": "string",
  "project_type": "pro",
  "acquisition_date": "2019-08-24T14:15:22Z",
  "project_group_id": 0,
  "billing_model": "CLOUD_STANDARD",
  "parent_id": "string",
  "owner_uuid": "a528e82a-c54a-4046-8831-44d7f9028f54",
  "owner_type": "ORG_GRP",
  "processing_email_notification": true,
  "parent_uuid": "77932ac3-028b-48fa-aaa9-4d11b1d1236a",
  "parent_type": "organization",
  "coordinate_system": "string",
  "coordinate_system_geoid_height": 0,
  "coordinate_system_extensions": null
}
{
  "id": 0,
  "name": "string",
  "project_type": "pro",
  "acquisition_date": "2019-08-24T14:15:22Z",
  "project_group_id": 0,
  "billing_model": "CLOUD_STANDARD",
  "parent_id": "string",
  "owner_uuid": "a528e82a-c54a-4046-8831-44d7f9028f54",
  "owner_type": "ORG_GRP",
  "proj_pipeline": "string",
  "processing_email_notification": true,
  "parent_uuid": "77932ac3-028b-48fa-aaa9-4d11b1d1236a",
  "parent_type": "organization",
  "coordinate_system": "string",
  "crs": {
    "definition": "string",
    "geoid_height": 0,
    "extensions": null,
    "identifier": "string",
    "name": "string",
    "type": "string",
    "units": {
      "property1": null,
      "property2": null
    },
    "axes": {
      "property1": null,
      "property2": null
    },
    "epsg": 0,
    "esri": 0,
    "wkt1": "string",
    "wkt2": "string",
    "proj_pipeline_to_wgs84": "string"
  }
}
Get a project.
The user is required to have access to the project.
A public_status field is provided to clients. It can take the values CREATED, UPLOADED, PROCESSING, DONE, ERROR.
A more detailed status can be found in the display_detailed_status field, but it is only for display, not for logic control, as it reflects an internal status whose name and flow might change.
There are two ways to serialize a project: simple and detailed. By default, the list REST action uses the simple serializer while retrieve uses the detailed one. You can override the serializer used by passing a serializer query parameter with value simple or detailed.
Notes:
- The min_zoom and max_zoom properties found in the object are deprecated. Please use instead the min_zoom and max_zoom properties that are provided for each of the map layers.
- display_user_name is an empty string when the owner_uuid represents an organization rather than an individual user. See the RESTful API documentation for POST /project/api/v3/projects/ for more details about owner_uuid.
id required | integer A unique integer value identifying this project. |
{
  "embed_urls": {
    "property1": "string",
    "property2": "string"
  },
  "error_reason": "string",
  "error_code": 0,
  "last_datetime_processing_started": "2019-08-24T14:15:22Z",
  "last_datetime_processing_ended": "2019-08-24T14:15:22Z",
  "s3_bucket_region": "string",
  "never_delete": true,
  "under_trial": true,
  "source": "string",
  "owner_uuid": "string",
  "credits": 0,
  "crs": {
    "definition": "string",
    "geoid_height": 0,
    "extensions": null,
    "identifier": "string",
    "name": "string",
    "type": "string",
    "units": {
      "property1": null,
      "property2": null
    },
    "axes": {
      "property1": null,
      "property2": null
    },
    "epsg": 0,
    "esri": 0,
    "wkt1": "string",
    "wkt2": "string",
    "proj_pipeline_to_wgs84": "string"
  },
  "public_share_token": "c827b6b5-2f34-47ee-824e-a48b2ab6b708",
  "public_status": "string",
  "detail_url": "string",
  "image_count": 0,
  "public_url": "string",
  "acquisition_date": "2019-08-24T14:15:22Z",
  "is_geolocalized": true,
  "s3_base_path": "string",
  "display_name": "string",
  "id": 0,
  "uuid": "095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
  "user_display_name": "string",
  "project_type": "pro",
  "project_group_id": 0,
  "create_date": "2019-08-24T14:15:22Z",
  "name": "string",
  "bucket_name": "string",
  "project_thumb": "string",
  "front_end_public_group_url": "string",
  "front_end_public_url": "string",
  "is_demo": true,
  "display_detailed_status": "string",
  "coordinate_system": "string",
  "outputs": "string",
  "min_zoom": -2147483648,
  "max_zoom": -2147483648,
  "proj_pipeline": "string"
}
Update the project.
The following attributes will be updated:
The coordinate system is compliant with the Open Photogrammetry Format (OPF) specification.
If passed, coordinate_system is either an EPSG code (e.g. EPSG:21781) or one of the following values accepted for arbitrary coordinate systems:
ARBITRARY_METERS for the software to use an arbitrary default coordinate system in meters
ARBITRARY_FEET for the software to use an arbitrary default coordinate system in feet
ARBITRARY_US_FEET for the software to use an arbitrary default coordinate system in US survey feet
If coordinate_system is passed, two optional fields can be added:
coordinate_system_geoid_height as Float to define the constant geoid height over the underlying ellipsoid, in the units of the vertical CRS axis.
coordinate_system_extensions as JSON object with extension-specific objects (for more information: https://pix4d.github.io/opf-spec/specification/control_points.html#crs).
Returns a 400 error code if the coordinate_system is invalid.
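The accepted coordinate_system values can be pre-checked on the client side before issuing the request and getting a 400 back. The sketch below mirrors the documented rule; the function name is ours, not part of the API, and the server remains the authority on which EPSG codes are actually supported.

```python
import re

# Documented alternatives to an EPSG code for arbitrary coordinate systems.
_ARBITRARY = {"ARBITRARY_METERS", "ARBITRARY_FEET", "ARBITRARY_US_FEET"}

def is_plausible_coordinate_system(value: str) -> bool:
    """Client-side pre-check (hypothetical helper): accepts an
    EPSG-prefixed code such as 'EPSG:21781' or one of the
    arbitrary-coordinate-system keywords."""
    if value in _ARBITRARY:
        return True
    return re.fullmatch(r"EPSG:\d+", value) is not None
```

This only filters out obviously malformed values; a syntactically valid but unknown EPSG code will still be rejected by the server with a 400.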
Unsupported cases:
In addition please note that to process with PIX4Dmapper:
id required | integer A unique integer value identifying this project. |
never_delete | boolean |
owner_uuid | string non-empty |
acquisition_date | string <date-time> |
is_geolocalized | boolean |
s3_base_path | string |
display_name | string [ 1 .. 100 ] characters |
project_group_id | integer or null |
coordinate_system | string or null |
coordinate_system_geoid_height | number or null <double> |
coordinate_system_extensions | any or null |
min_zoom | integer [ -2147483648 .. 2147483647 ] |
max_zoom | integer [ -2147483648 .. 2147483647 ] |
{
  "never_delete": true,
  "owner_uuid": "string",
  "acquisition_date": "2019-08-24T14:15:22Z",
  "is_geolocalized": true,
  "s3_base_path": "string",
  "display_name": "string",
  "project_group_id": 0,
  "coordinate_system": "string",
  "coordinate_system_geoid_height": 0,
  "coordinate_system_extensions": null,
  "min_zoom": -2147483648,
  "max_zoom": -2147483648
}
{
  "embed_urls": {
    "property1": "string",
    "property2": "string"
  },
  "error_reason": "string",
  "error_code": 0,
  "last_datetime_processing_started": "2019-08-24T14:15:22Z",
  "last_datetime_processing_ended": "2019-08-24T14:15:22Z",
  "s3_bucket_region": "string",
  "never_delete": true,
  "under_trial": true,
  "source": "string",
  "owner_uuid": "string",
  "credits": 0,
  "crs": {
    "definition": "string",
    "geoid_height": 0,
    "extensions": null,
    "identifier": "string",
    "name": "string",
    "type": "string",
    "units": {
      "property1": null,
      "property2": null
    },
    "axes": {
      "property1": null,
      "property2": null
    },
    "epsg": 0,
    "esri": 0,
    "wkt1": "string",
    "wkt2": "string",
    "proj_pipeline_to_wgs84": "string"
  },
  "public_share_token": "c827b6b5-2f34-47ee-824e-a48b2ab6b708",
  "public_status": "string",
  "detail_url": "string",
  "image_count": 0,
  "public_url": "string",
  "acquisition_date": "2019-08-24T14:15:22Z",
  "is_geolocalized": true,
  "s3_base_path": "string",
  "display_name": "string",
  "id": 0,
  "uuid": "095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
  "user_display_name": "string",
  "project_type": "pro",
  "project_group_id": 0,
  "create_date": "2019-08-24T14:15:22Z",
  "name": "string",
  "bucket_name": "string",
  "project_thumb": "string",
  "front_end_public_group_url": "string",
  "front_end_public_url": "string",
  "is_demo": true,
  "display_detailed_status": "string",
  "coordinate_system": "string",
  "outputs": "string",
  "min_zoom": -2147483648,
  "max_zoom": -2147483648,
  "proj_pipeline": "string"
}
List the project depth data
Retrieve all the depth data associated with the project.
id required | string^[0-9a-fA-F-]+$ |
{
  "id": 0,
  "photo": 0,
  "depth_map_confidence": "string",
  "depth_map": "string"
}
List the project extras.
Returns the list of the project's registered extras (p4d, masks, ...). Each file is a hash that contains several pieces of access information.
id required | string^[0-9a-fA-F-]+$ |
{
  "file_key": "string"
}
Register a project extra.
Requests to register a file as a project extra.
id required | string^[0-9a-fA-F-]+$ |
file_key required | string non-empty |
{
  "file_key": "string"
}
{
  "file_key": "string"
}
List the project GCPs.
Return the list of the project's GCPs. If the coordinate system is EPSG:4326, x is read as the longitude and y as the latitude.
{
"gcps": [
{
"id": 123,
"project": 456,
"name": "GCP_123",
"point_type": "CHECKPOINT",
"x": 1.23,
"y": 45.2,
"z": 445.87,
"xy_accuracy": 0.02,
"z_accuracy": 0.02
}
]
}
In case of error, nothing is registered.
id required | string^[0-9a-fA-F-]+$ |
{
  "id": 0,
  "project": 0,
  "name": "string",
  "point_type": "GCP",
  "x": 0,
  "y": 0,
  "z": 0,
  "xy_accuracy": 0,
  "z_accuracy": 0
}
Update the project GCP.
gcp_name required | string^\w+$ |
id required | string^[0-9a-fA-F-]+$ |
name | string [ 1 .. 200 ] characters |
point_type | string (PointTypeEnum) Enum: "GCP" "CHECKPOINT"
|
x | number <double> |
y | number <double> |
z | number <double> |
xy_accuracy | number or null <double> |
z_accuracy | number or null <double> |
{
  "name": "string",
  "point_type": "GCP",
  "x": 0,
  "y": 0,
  "z": 0,
  "xy_accuracy": 0,
  "z_accuracy": 0
}
{
  "id": 0,
  "project": 0,
  "name": "string",
  "point_type": "GCP",
  "x": 0,
  "y": 0,
  "z": 0,
  "xy_accuracy": 0,
  "z_accuracy": 0
}
Create project GCPs in bulk.
The GCP name must be unique for this project. The project must have a coordinate system defined. The GCPs will be read in this coordinate system. Since EPSG:4326 is not supported as a project coordinate system, the GCPs cannot be given in this system (degrees) either.
The project id is taken from the captured URL argument.
Point type is one of [GCP | CHECKPOINT]
Takes a list of GCPs like so:
{
"gcps": [
{
"name": "GCP_123",
"point_type": "CHECKPOINT",
"x": 1.23,
"y": 45.2,
"z": 445.87,
"xy_accuracy": 0.02,
"z_accuracy": 0.02
},
{...}
]
}
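The bulk payload above can be assembled and sanity-checked client-side before posting. The helper below is a minimal sketch mirroring the documented constraints (unique names per project, point_type one of GCP | CHECKPOINT); the function name is ours, not part of the API.

```python
from typing import Dict, List

POINT_TYPES = {"GCP", "CHECKPOINT"}  # per the endpoint description

def build_gcps_payload(gcps: List[Dict]) -> Dict:
    """Wrap GCP dicts into the bulk-create request body, enforcing the
    documented rules locally (hypothetical helper)."""
    names = [g["name"] for g in gcps]
    if len(names) != len(set(names)):
        raise ValueError("GCP names must be unique for this project")
    for g in gcps:
        if g.get("point_type", "GCP") not in POINT_TYPES:
            raise ValueError(f"invalid point_type: {g.get('point_type')!r}")
    return {"gcps": gcps}
```

Remember that the GCPs are read in the project's coordinate system, which must be defined beforehand and cannot be EPSG:4326.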
id required | string^[0-9a-fA-F-]+$ |
required | Array of objects (GCPRequest) |
{
  "gcps": [
    {
      "name": "string",
      "point_type": "GCP",
      "x": 0,
      "y": 0,
      "z": 0,
      "xy_accuracy": 0,
      "z_accuracy": 0
    }
  ]
}
{
  "gcps": [
    {
      "id": 0,
      "project": 0,
      "name": "string",
      "point_type": "GCP",
      "x": 0,
      "y": 0,
      "z": 0,
      "xy_accuracy": 0,
      "z_accuracy": 0
    }
  ]
}
Register project inputs.
Registers the list of inputs for the project. Inputs are the images and possibly a p4d file. You can also pass the images in zip format, with or without depth_map and depth_map_confidence files.
e.g.
{
"input_file_keys": [
"user-123/project-123/.../images.zip"
]
}
It's possible to register images with depth data using the inputs key.
This endpoint can be called several times for a single project.
If you are uploading images directly, it is advised to register them in batches of 500 images or less, to avoid timeout issues caused by the processing performed at image registration. Using this API with more than 500 images might work, but there is no guarantee, and the response will be forced to contain an empty images attribute to keep its size manageable.
input_file_keys
must be an array of strings, each being the full s3 key of the file
e.g.
{
"input_file_keys": [
"user-123/project-123/.../IMG_xxx1.jpg",
"user-123/project-123/.../IMG_xxx2.jpg"
]
}
inputs
must be an array of dictionaries, each containing photo and its assets. The
values are s3 keys of the corresponding files. For the full list of photo assets see
project.constants.INPUT_TYPES.
Depth data is registered only when both depth_map and depth_map_confidence are present; otherwise, the depth data input is ignored.
e.g.
{
"inputs": [
{
"photo": "user-123/project-123/.../Image_xxx1.jpg",
"depth_map_confidence": "user-123/project-123/.../Confidence_xxx1.tiff",
"depth_map": "user-123/project-123/.../DepthMap_xxx1.tiff"
}
]
}
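Since registration is advised in batches of 500 images or less and the endpoint can be called several times per project, splitting a large key list into successive request bodies is straightforward. A minimal sketch (the helper name is ours, not part of the API):

```python
from typing import Iterable, Iterator, List

def batch_input_file_keys(keys: Iterable[str], batch_size: int = 500) -> Iterator[dict]:
    """Yield register-inputs request bodies holding at most batch_size
    file keys each, following the 500-image batching advice."""
    batch: List[str] = []
    for key in keys:
        batch.append(key)
        if len(batch) == batch_size:
            yield {"input_file_keys": batch}
            batch = []
    if batch:  # trailing partial batch
        yield {"input_file_keys": batch}
```

Each yielded dict would be POSTed to the register-inputs endpoint in turn; the same batching approach applies to the inputs key when registering photos with depth data.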
File keys must be valid s3 keys. Therefore:
They must start with the user-xxx/project-xxx/ prefix that was returned in the credentials request (any other prefix should have failed when you tried to put the files on s3).
Input files must be valid image files (either passed directly or in zip format) supported by the PIX4Dmapper software, named with the proper extension.
Returns
The thumbnail of an image takes time to generate, so the returned thumbnail link might return a 404 for a while before the thumbnail is actually there.
When uploading images in zip format, this input type is considered an extra and the returned extra_files_registered count is incremented by 1. The images contained in the zip bundle are not counted.
If project processing was already triggered before calling the endpoint, the inputs are not registered and the endpoint returns 400 - Bad Request.
e.g.
{
"nb_images_registered": 2,
"extra_files_registered": 0,
"images": [
{
"id": 23167077,
"temp_url": "https://s3.amazonaws.com/test.pix4d.com/user-123/project-345/potatoes/IMG_170328_124606_0223_RED.TIF?AWSAccessKeyId=AKIAJ4X7DJRPQPFCIMOQ&Signature=slJpeyo0r5Pammg%2FWU61fSdu9hU%3D&Expires=1502786097",
"s3_key": "user-123/project-345/potatoes/IMG_170328_124606_0223_RED.TIF",
"file_size": null,
"thumb_s3_key": null,
"thumb_url": null,
"exif": {},
"s3_bucket": "test.pix4d.com"
},
{
"id": 23167078,
"temp_url": "https://s3.amazonaws.com/test.pix4d.com/user-123/project-345/potatoes/IMG_170328_124604_0224_RED.TIF?AWSAccessKeyId=AKIAJ4X7DJRPQPFCIMOQ&Signature=DmbjLn6IplbV%2Fb8GgyLOCXIJOEk%3D&Expires=1502786097",
"s3_key": "user-123/project-345/potatoes/IMG_170328_124604_0224_RED.TIF",
"file_size": null,
"thumb_s3_key": null,
"thumb_url": null,
"exif": {},
"s3_bucket": "test.pix4d.com"
}
],
"p4d_registered": false,
"nb_image_signatures_registered": 0,
"nb_depth_data_registered": 0
}
id required | string^[0-9a-fA-F-]+$ |
input_file_keys | Array of strings[ items [ 1 .. 1024 ] characters ] |
Array of objects (ProjectInputRequest) |
{
  "input_file_keys": [
    "string"
  ],
  "inputs": [
    {
      "photo": "string",
      "depth_map_confidence": "string",
      "depth_map": "string"
    }
  ]
}
{
  "input_file_keys": [
    "string"
  ],
  "inputs": [
    {
      "photo": "string",
      "depth_map_confidence": "string",
      "depth_map": "string"
    }
  ]
}
Get the project inputs as archive.
Retrieve the URL to download the input zip containing the images and p4d file.
Returns
id required | string^[0-9a-fA-F-]+$ |
{
  "embed_urls": {
    "property1": "string",
    "property2": "string"
  },
  "error_reason": "string",
  "error_code": 0,
  "last_datetime_processing_started": "2019-08-24T14:15:22Z",
  "last_datetime_processing_ended": "2019-08-24T14:15:22Z",
  "s3_bucket_region": "string",
  "never_delete": true,
  "under_trial": true,
  "source": "string",
  "owner_uuid": "string",
  "credits": 0,
  "crs": {
    "definition": "string",
    "geoid_height": 0,
    "extensions": null,
    "identifier": "string",
    "name": "string",
    "type": "string",
    "units": {
      "property1": null,
      "property2": null
    },
    "axes": {
      "property1": null,
      "property2": null
    },
    "epsg": 0,
    "esri": 0,
    "wkt1": "string",
    "wkt2": "string",
    "proj_pipeline_to_wgs84": "string"
  },
  "public_share_token": "c827b6b5-2f34-47ee-824e-a48b2ab6b708",
  "public_status": "string",
  "detail_url": "string",
  "image_count": 0,
  "public_url": "string",
  "acquisition_date": "2019-08-24T14:15:22Z",
  "is_geolocalized": true,
  "s3_base_path": "string",
  "display_name": "string",
  "id": 0,
  "uuid": "095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
  "user_display_name": "string",
  "project_type": "pro",
  "project_group_id": 0,
  "create_date": "2019-08-24T14:15:22Z",
  "name": "string",
  "bucket_name": "string",
  "project_thumb": "string",
  "front_end_public_group_url": "string",
  "front_end_public_url": "string",
  "is_demo": true,
  "display_detailed_status": "string",
  "coordinate_system": "string",
  "outputs": "string",
  "min_zoom": -2147483648,
  "max_zoom": -2147483648,
  "proj_pipeline": "string"
}
List the project marks.
{
"marks": [
{
"id": 130521,
"gcp": "GCP_123",
"gcp_id": 96120,
"photo": "user-123/project-354/my_file.jpg",
"x": 1.23,
"y": 45.2
}
]
}
gcp: name/label of the tie-point (GCP or MTP) corresponding to the mark.
gcp_id: ID of the GCP/MTP corresponding to the mark.
x, y: location of the mark in the photo.
photo: name of the photo that contains the mark.
id required | string^[0-9a-fA-F-]+$ |
{
  "id": 0,
  "photo": "string",
  "gcp": "string",
  "gcp_id": 0,
  "x": 0,
  "y": 0
}
Update the project mark.
Update a previously registered mark. You can't change the GCP or the photo the mark is registered on. Trying to do so will be a no-op. If you need to do that, you'll have to delete the mark and create a new one.
id required | string^[0-9a-fA-F-]+$ |
mark_id required | string^[0-9]+$ |
x | number <double> |
y | number <double> |
{
  "x": 0,
  "y": 0
}
{
  "id": 0,
  "photo": "string",
  "gcp": "string",
  "gcp_id": 0,
  "x": 0,
  "y": 0
}
Create project marks in bulk.
Create Marks associated with the project. The Mark photo and GCP must exist and be defined
on the same project.
Note that gcp may refer to any type of tie-point, i.e. a GCP or MTP.
Takes a list of marks, with photo being the s3_key of an image of the project, gcp the name of a GCP of the project, and x/y being the positive pixel coordinates of the GCP inside the photo, with respect to the top-left corner:
{
"marks": [
{
"gcp": "GCP_123",
"photo": "user-123/project-354/my_file.jpg",
"x": 1.23,
"y": 45.2
},
{...}
]
}
A GCP can be marked only once on a given photo.
In case of errors, nothing is registered.
If the error is due to photo(s) and/or gcp(s) that we can't find, those are returned with the format:
{"photos": [], "gcps": []}
If the error is due to an attempt to register a GCP twice on a photo we return
{"detail": "Attempt to create existing mark(s): [['photo', 'gcp'], [...]]"}
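The two error conditions above can be caught before the request is sent. The sketch below mirrors the documented rules (one mark per photo/GCP pair, positive pixel coordinates); the helper name is ours, not part of the API.

```python
from typing import Dict, List

def build_marks_payload(marks: List[Dict]) -> Dict:
    """Assemble the bulk-create marks body, rejecting locally what the
    server would reject (hypothetical helper): a GCP marked twice on the
    same photo, or non-positive pixel coordinates."""
    seen = set()
    for m in marks:
        pair = (m["photo"], m["gcp"])
        if pair in seen:
            raise ValueError(f"Attempt to create existing mark: {list(pair)}")
        seen.add(pair)
        if m["x"] < 0 or m["y"] < 0:
            raise ValueError("x/y must be positive pixel coordinates")
    return {"marks": marks}
```

Remember that the photo and GCP must already exist on the same project; referencing unknown ones makes the whole request fail with nothing registered.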
id required | string^[0-9a-fA-F-]+$ |
required | Array of objects (MarkRequest) |
{
  "marks": [
    {
      "x": 0,
      "y": 0
    }
  ]
}
{
  "marks": [
    {
      "id": 0,
      "photo": "string",
      "gcp": "string",
      "gcp_id": 0,
      "x": 0,
      "y": 0
    }
  ]
}
Create project MTPs in bulk.
MTP names must be unique per project.
id required | string^[0-9a-fA-F-]+$ |
required | Array of objects (MTPRequest) |
{
  "mtps": [
    {
      "name": "string",
      "is_checkpoint": false,
      "x": 0,
      "y": 0,
      "z": 0
    }
  ]
}
{
  "mtps": [
    {
      "name": "string",
      "project": 0,
      "id": 0,
      "is_checkpoint": false,
      "x": 0,
      "y": 0,
      "z": 0
    }
  ]
}
Update a project MTP.
id required | string^[0-9a-fA-F-]+$ |
mtp_id required | string^[\d]+$ |
name | string [ 1 .. 200 ] characters |
is_checkpoint | boolean Default: false |
x | number or null <double> |
y | number or null <double> |
z | number or null <double> |
{
  "name": "string",
  "is_checkpoint": false,
  "x": 0,
  "y": 0,
  "z": 0
}
{
  "name": "string",
  "project": 0,
  "id": 0,
  "is_checkpoint": false,
  "x": 0,
  "y": 0,
  "z": 0
}
List the project outputs.
Returns a list of all the project's outputs, i.e. files built from the processing results.
id required | string^[0-9a-fA-F-]+$ |
{
  "output": "string",
  "output_type": "ortho_rgba_bundle"
}
Register a project output
Output is the file path. It is expected to be the full s3 key (starts with user-123/project-456)
Passing an output_type is optional. If one is passed it must be valid. If none is passed, the type is derived from the file path.
If an output of the same type already existed, the new output replaces the old one.
NOTE: If you are uploading a 3D object/texture/material, upload all three in the same request through the bulk endpoint, NOT this endpoint, to ensure downstream items that require all three stay consistent.
Returns
id required | string^[0-9a-fA-F-]+$ |
output required | string [ 1 .. 1024 ] characters |
output_type | string (OutputOutputTypeEnum) Enum: "ortho_rgba_bundle" "ortho_rgb_bundle" "gaussian_splatting" "gaussian_splatting_display" "gaussian_splatting_potree" "3d_mesh_draco" "3d_mesh_fbx" "3d_mesh_mat" "3d_mesh_obj" "3d_mesh_obj_zip" "3d_mesh_slpk" "3d_mesh_texture_2048" "3d_mesh_texture_8192" "3d_mesh_texture_16384" "3d_mesh_texture" "3d_mesh_thumb" "b3dm_js" "b3dm_js_identity" "calibrated_camera_parameters" "calibrated_camera_parameters_json" "calibrated_external_camera_parameters" "calibrated_external_camera_parameters_error" "calibrated_external_camera_parameters_error_json" "cell_tower_analytics" "contour_files_bundle" "dsm_bundle" "dsm_cloud_optimized" "dsm_cloud_optimized_display" "dsm" "dsm_metadata" "dsm_preview" "dsm_thumb" "dtm_bundle" "dtm" "gcp_quality_report" "ifc" "input_zip" "inspect_report_json" "inspect_report_pdf" "project_report_json" "project_report_pdf" "i_construction" "log_file_bundle" "mapper_log" "ndvi" "ndvi_metadata" "ndvi_thumb" "ortho_cloud_optimized" "ortho_cloud_optimized_display" "ortho" "ortho_metadata" "ortho_rgb" "ortho_thumb" "point_cloud_bundle" "point_cloud_gltf" "point_cloud" "potree_js" "potree_metadata" "point_cloud_slpk" "project_offset" "project_thumb" "project_wkt" "project_zip" "quality_report" "reflectance_green" "reflectance_maps_bundle" "reflectance_nir" "reflectance_red" "reflectance_reg" "xml_quality_report" "opf_project" "calibrated_cameras" "input_cameras" "opf_scene_ref_frame" "json_quality_report"
|
{
  "output": "string",
  "output_type": "ortho_rgba_bundle"
}
{
  "output": "string",
  "output_type": "ortho_rgba_bundle"
}
Given project_id and output_id, delete an output of the project.
Returns
id required | string^[0-9a-fA-F-]+$ |
output_id required | string^[0-9]+$ |
Register project outputs in bulk
Output is the file path. It is expected to be the full s3 key (starts with user-123/project-456).
Passing an output_type is optional. If one is passed it must be valid. If none is passed, the type is derived from the file path.
If an output of the same type already existed, the new output replaces the old one.
outputs is a list of output specifiers.
An output consists of:
output: the path on S3
output_type: (optional) see note above
For example:
{
"outputs": [
{
"output": "some/s3/key/file1.ext"
},
{
"output": "some/s3/key/file2.ext",
"output_type": "some_type"
}
]
}
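The example body above can be built from a simple list of (key, type) pairs, omitting output_type when the server should derive the type from the file path. A minimal sketch (the helper name is ours, not part of the API):

```python
from typing import Dict, Iterable, Optional, Tuple

def build_outputs_payload(entries: Iterable[Tuple[str, Optional[str]]]) -> Dict:
    """Assemble the bulk-register body: each entry is (s3_key, output_type),
    where output_type may be None to let the server derive the type from
    the file path (hypothetical helper)."""
    outputs = []
    for key, output_type in entries:
        spec = {"output": key}
        if output_type is not None:
            spec["output_type"] = output_type  # must be a valid type if passed
        outputs.append(spec)
    return {"outputs": outputs}
```

As documented, an output of a type that already existed replaces the old one on the server side.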
Returns
id required | string^[0-9a-fA-F-]+$ |
required | Array of objects (OutputRequest) non-empty |
{
  "outputs": [
    {
      "output": "string",
      "output_type": "ortho_rgba_bundle"
    }
  ]
}
{
  "outputs": [
    {
      "output": "string",
      "output_type": "ortho_rgba_bundle"
    }
  ]
}
Get the project outputs as archive.
Retrieve the URL to download the project zip containing almost everything.
Returns
id required | string^[0-9a-fA-F-]+$ |
{
  "embed_urls": {
    "property1": "string",
    "property2": "string"
  },
  "error_reason": "string",
  "error_code": 0,
  "last_datetime_processing_started": "2019-08-24T14:15:22Z",
  "last_datetime_processing_ended": "2019-08-24T14:15:22Z",
  "s3_bucket_region": "string",
  "never_delete": true,
  "under_trial": true,
  "source": "string",
  "owner_uuid": "string",
  "credits": 0,
  "crs": {
    "definition": "string",
    "geoid_height": 0,
    "extensions": null,
    "identifier": "string",
    "name": "string",
    "type": "string",
    "units": {
      "property1": null,
      "property2": null
    },
    "axes": {
      "property1": null,
      "property2": null
    },
    "epsg": 0,
    "esri": 0,
    "wkt1": "string",
    "wkt2": "string",
    "proj_pipeline_to_wgs84": "string"
  },
  "public_share_token": "c827b6b5-2f34-47ee-824e-a48b2ab6b708",
  "public_status": "string",
  "detail_url": "string",
  "image_count": 0,
  "public_url": "string",
  "acquisition_date": "2019-08-24T14:15:22Z",
  "is_geolocalized": true,
  "s3_base_path": "string",
  "display_name": "string",
  "id": 0,
  "uuid": "095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
  "user_display_name": "string",
  "project_type": "pro",
  "project_group_id": 0,
  "create_date": "2019-08-24T14:15:22Z",
  "name": "string",
  "bucket_name": "string",
  "project_thumb": "string",
  "front_end_public_group_url": "string",
  "front_end_public_url": "string",
  "is_demo": true,
  "display_detailed_status": "string",
  "coordinate_system": "string",
  "outputs": "string",
  "min_zoom": -2147483648,
  "max_zoom": -2147483648,
  "proj_pipeline": "string"
}
List the project photos.
Returns the paginated list of the project's registered photos. Each photo is a map that contains several pieces of access information.
You can use a photo_ids query param, a comma-separated list of photo ids, if you need the details of a known subset of photos.
You can pass an ordering query parameter to specify on which field the results should be ordered.
Supported fields for ordering:
excluded_from_mapper is a flag used for photos that are part of the project but not taken into account in the photogrammetry processing. A value of true means that they are not considered for processing, while a value of false or null means that they behave as default photos, i.e. are included in the photogrammetry.
Sample response
{
"count": 12,
"next": "https://....",
"previous": "https://...",
"results": [
{
"id": 123,
"s3_key": "foo/bar/baz.png",
"thumbs_s3_key": {
"small": "foo/bar/baz_thumb_1.jpg"
},
"s3_bucket": "bucket_name",
"width": 125,
"height": 156,
"excluded_from_mapper": null
}
]
}
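The photo_ids and ordering query parameters described above can be assembled into a query string like so; this is a minimal sketch, and the helper name is ours, not part of the API.

```python
from urllib.parse import urlencode

def photos_query(photo_ids=None, ordering=None) -> str:
    """Build the query string for the list-photos endpoint (sketch):
    photo_ids becomes a comma-separated list of ids, ordering names the
    field to sort the results on."""
    params = {}
    if photo_ids:
        params["photo_ids"] = ",".join(str(i) for i in photo_ids)
    if ordering:
        params["ordering"] = ordering
    return urlencode(params)
```

The resulting string is appended after a ? to the endpoint URL; urlencode percent-escapes the commas, which the server accepts like literal commas.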
id required | string^[0-9a-fA-F-]+$ |
Get the project photo
excluded_from_mapper is a flag used for photos that are part of the project but not taken into account in the photogrammetry processing. A value of true means that they are not considered for processing, while a value of false or null means that they behave as default photos, i.e. are included in the photogrammetry.
Sample response
{
"id": 123,
"s3_key": "foo/bar/baz.png",
"thumbs_s3_key": {
"small": "foo/bar/baz_thumb_1.jpg"
},
"s3_bucket": "bucket_name",
"width": 125,
"height": 156,
"excluded_from_mapper": null
}
id required | string^[0-9a-fA-F-]+$ |
photo_id required | string^[0-9a-f-]+$ |
{
  "id": 0,
  "s3_key": "string",
  "thumbs_s3_key": "string",
  "s3_bucket": "string",
  "width": 0,
  "height": 0,
  "excluded_from_mapper": true
}
Update the project photo
excluded_from_mapper is a flag used for photos that are part of the project but not taken into account in the photogrammetry processing. A value of true means that they are not considered for processing, while a value of false or null means that they behave as default photos, i.e. are included in the photogrammetry.
Sample response
{
"id": 123,
"s3_key": "foo/bar/baz.png",
"thumbs_s3_key": {
"small": "foo/bar/baz_thumb_1.jpg"
},
"s3_bucket": "bucket_name",
"width": 125,
"height": 156,
"excluded_from_mapper": null
}
id required | string^[0-9a-fA-F-]+$ |
photo_id required | string^[0-9a-f-]+$ |
excluded_from_mapper | boolean or null |
{
  "excluded_from_mapper": true
}
{
  "id": 0,
  "s3_key": "string",
  "thumbs_s3_key": "string",
  "s3_bucket": "string",
  "width": 0,
  "height": 0,
  "excluded_from_mapper": true
}
Get XMP of the photo
Returns the XMP of the photo, if any data is available.
XMP data does not follow any spec, so each manufacturer stores it differently. Currently we support Parrot Anafi and DJI (with gimbal).
Note that the XMP data contains float angles in degrees, as strings. They can be prefixed by a '+' or '-' sign (or no sign). The list of attributes is not guaranteed, meaning you can have from zero to many. This is only an example of some attributes:
{
"xmp": {
"yaw": "-6.00",
"pitch": "+7.123440",
"roll": "0.00"
}
}
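Since the angles arrive as optionally signed strings and the attribute set is not guaranteed, a consumer typically converts whatever is present to floats. A minimal sketch (the helper name is ours, not part of the API):

```python
def parse_xmp_angles(xmp: dict) -> dict:
    """Convert the signed degree strings ('+7.123440', '-6.00', '0.00')
    returned in the xmp object to floats. Only the attributes actually
    present are converted, since none are guaranteed."""
    return {name: float(value) for name, value in xmp.items()}
```

Python's float() already accepts a leading '+' or '-' (or no sign), so no extra stripping is needed.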
id required | integer A unique integer value identifying this project. |
photo_id required | string^[0-9a-f-]+$ |
Get the project processing options.
RESTful interaction with the project's processing options. A shortcut to define processing
options is to pass them to the start_processing
endpoint.
Retrieve the processing options.
This endpoint returns processing options in the output, for their specific format see the documentation of the set processing options (POST) endpoint.
id required | string^[0-9a-fA-F-]+$ |
{
  "tags": [
    "rtk"
  ],
  "output_cs_horizontal": 1024,
  "output_cs_vertical": 1024,
  "output_cs_geoid": "string",
  "output_cs_geoid_height": 0,
  "outputs": [
    "ortho"
  ],
  "formats": {
    "ortho": [
      "tif"
    ],
    "dsm": [
      "tif"
    ],
    "mesh": [
      "slpk"
    ],
    "gaussian_splatting": [
      "gltf"
    ],
    "point_cloud": [
      "laz"
    ]
  },
  "area": {
    "plane": null,
    "thickness": 0
  },
  "custom_template_s3_key": "string",
  "custom_template_s3_bucket": "string",
  "standard_template": "3d-maps"
}
Set the project processing options.
RESTful interaction with the project's processing options. A shortcut to define processing
options is to pass them to the start_processing
endpoint.
Create/overwrite processing options. Any option not passed in the request body will use the default value.
Processing options:
tags: keywords that describe the input data and can influence the type of processing performed. Supported tags are:
rtk: GPS image data is captured using an RTK device.
building: input images are from oblique flights around objects with little texture.
3d-maps or nadir: process with settings optimised for nadir images.
oblique: process with settings optimised for oblique images.
half-scale: scale down data processed in non-PIX4Dmapper pipelines.
quarter-scale: scale down further data processed in non-PIX4Dmapper pipelines.
flat: used together with the nadir tag, to process datasets of flat terrains.
output_cs_horizontal (Pix4D internal): EPSG code of the horizontal output coordinate system (CS). Expected to be set if any other output_cs_* parameter is set.
output_cs_vertical (Pix4D internal): EPSG code of the vertical output CS. Must be defined with either output_cs_geoid or output_cs_geoid_height.
output_cs_geoid (Pix4D internal): name of the geoid to use with the output CS.
output_cs_geoid_height (Pix4D internal): the constant geoid height in meters to use with the output CS. Note that currently this constant cannot be used for the US, Myanmar and Liberia.
outputs: keywords that describe the desired output types to be generated. These cannot be used in conjunction with any of the template arguments above. If no outputs are passed, default ones will be generated. Supported values are:
ortho: Orthomosaic geotiff
dsm: DSM geotiff
point_cloud: Point Cloud las or laz depending on the context and calibrated camera parameters file
mesh: Mesh as OBJ/material/texture files and offset xyz file
gaussian_splatting: Gaussian Splatting 3D model
formats: dictionary where the key is an output and the value valid extensions, e.g. "formats": {"point_cloud": ["laz", "slpk"]}
area: region of interest defined in OPF format with the coordinates in WGS84. See https://github.com/Pix4D/opf-spec/blob/main/schema/plane.schema.json. Example:
"plane": {
  "vertices3d": [
    [
      3.2483144217356084,
      43.41515239449451,
      0
    ],
    ...
  ],
  "outer_boundary": [
    0,
    ...
  ]
},
"thickness": 10
standard_template: the name of a valid Pix4Dmapper default template. List available inside Pix4Dmapper. Incompatible with the custom_template_s3_key option. Not applied to projects coming from Pix4Dmapper (that already contain a fully configured p4d configuration file).
custom_template_s3_key: the full S3 key of the template file (.tmpl) for Pix4Dmapper to use. Incompatible with the standard_template option. Expected to be set if custom_template_s3_bucket is set. Not applied to projects coming from Pix4Dmapper (that already contain a fully configured p4d configuration file).
custom_template_s3_bucket: the bucket where the template file can be found. Expected to be set if custom_template_s3_key is set.
id required | string^[0-9a-fA-F-]+$ |
tags | Array of strings or null (TagsEnum) Enum: "rtk" "building" "3d-maps" "nadir" "flat" "oblique" "half-scale" "quarter-scale" "high-confidence-positions" Project data classifications tags. Valid tags are: ['rtk', 'building', '3d-maps', 'nadir', 'flat', 'oblique', 'half-scale', 'quarter-scale', 'high-confidence-positions'] |
output_cs_horizontal | integer or null [ 1024 .. 32767 ] EPSG code of the desired output horizontal coordinate system |
output_cs_vertical | integer or null [ 1024 .. 32767 ] EPSG code of the desired output vertical coordinate system |
output_cs_geoid | string or null <= 50 characters |
output_cs_geoid_height | number or null <double> |
outputs | Array of strings or null (OutputsEnum) Enum: "ortho" "dsm" "point_cloud" "mesh" "gaussian_splatting" Outputs to be created when processing a project. Valid outputs are: ['ortho', 'dsm', 'point_cloud', 'mesh', 'gaussian_splatting'] |
object (FormatsFieldRequest) Specify formats for the requested outputs. Values for each output can define one or more formats from the available choices. Example:
| |
object (AreaRequest) | |
custom_template_s3_key | string or null <= 1024 characters Deprecated The S3 key for a valid pix4dmapper template .tmpl file |
custom_template_s3_bucket | string or null <= 63 characters Deprecated The S3 bucket for a valid pix4dmapper template .tmpl file |
(StandardTemplateEnum (string or null)) or (NullEnum (any or null)) Deprecated |
{
  "tags": [
    "rtk"
  ],
  "output_cs_horizontal": 1024,
  "output_cs_vertical": 1024,
  "output_cs_geoid": "string",
  "output_cs_geoid_height": 0,
  "outputs": [
    "ortho"
  ],
  "formats": {
    "ortho": [
      "tif"
    ],
    "dsm": [
      "tif"
    ],
    "mesh": [
      "slpk"
    ],
    "gaussian_splatting": [
      "gltf"
    ],
    "point_cloud": [
      "laz"
    ]
  },
  "area": {
    "plane": null,
    "thickness": 0
  },
  "custom_template_s3_key": "string",
  "custom_template_s3_bucket": "string",
  "standard_template": "3d-maps"
}
{
  "tags": [
    "rtk"
  ],
  "output_cs_horizontal": 1024,
  "output_cs_vertical": 1024,
  "output_cs_geoid": "string",
  "output_cs_geoid_height": 0,
  "outputs": [
    "ortho"
  ],
  "formats": {
    "ortho": [
      "tif"
    ],
    "dsm": [
      "tif"
    ],
    "mesh": [
      "slpk"
    ],
    "gaussian_splatting": [
      "gltf"
    ],
    "point_cloud": [
      "laz"
    ]
  },
  "area": {
    "plane": null,
    "thickness": 0
  },
  "custom_template_s3_key": "string",
  "custom_template_s3_bucket": "string",
  "standard_template": "3d-maps"
}
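Several of the constraints on processing options described above (template options being mutually exclusive, output_cs_vertical requiring a geoid definition) can be checked client-side before posting. The sketch below mirrors those documented rules; the helper name is ours, not part of the API.

```python
def check_processing_options(opts: dict) -> dict:
    """Minimal client-side sanity checks mirroring the documented
    constraints (hypothetical helper). Returns the options unchanged
    when they look consistent."""
    if "standard_template" in opts and "custom_template_s3_key" in opts:
        raise ValueError("standard_template is incompatible with custom_template_s3_key")
    if "custom_template_s3_bucket" in opts and "custom_template_s3_key" not in opts:
        raise ValueError("custom_template_s3_bucket requires custom_template_s3_key")
    if "output_cs_vertical" in opts and not (
        "output_cs_geoid" in opts or "output_cs_geoid_height" in opts
    ):
        raise ValueError("output_cs_vertical requires output_cs_geoid or output_cs_geoid_height")
    return opts
```

The server performs the authoritative validation (including EPSG ranges and the template/outputs exclusivity); this only catches the most common mistakes early.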
Update the project processing options.
RESTful interaction with the project's processing options. A shortcut to define processing
options is to pass them to the start_processing
endpoint.
Create/overwrite processing options. Any option not passed in the request body will use the default value.
This endpoint accepts parameters in the payload, for their specific format see the documentation of the set processing options (POST) endpoint.
id required | string^[0-9a-fA-F-]+$ |
tags | Array of strings or null (TagsEnum) Enum: "rtk" "building" "3d-maps" "nadir" "flat" "oblique" "half-scale" "quarter-scale" "high-confidence-positions" Project data classifications tags. Valid tags are: ['rtk', 'building', '3d-maps', 'nadir', 'flat', 'oblique', 'half-scale', 'quarter-scale', 'high-confidence-positions'] |
output_cs_horizontal | integer or null [ 1024 .. 32767 ] EPSG code of the desired output horizontal coordinate system |
output_cs_vertical | integer or null [ 1024 .. 32767 ] EPSG code of the desired output vertical coordinate system |
output_cs_geoid | string or null <= 50 characters |
output_cs_geoid_height | number or null <double> |
outputs | Array of strings or null (OutputsEnum) Enum: "ortho" "dsm" "point_cloud" "mesh" "gaussian_splatting" Outputs to be created when processing a project. Valid outputs are: ['ortho', 'dsm', 'point_cloud', 'mesh', 'gaussian_splatting'] |
object (FormatsFieldRequest) Specify formats for the requested outputs. Values for each output can define one or more formats from the available choices. Example:
| |
object (AreaRequest) | |
custom_template_s3_key | string or null <= 1024 characters Deprecated The S3 key for a valid pix4dmapper template .tmpl file |
custom_template_s3_bucket | string or null <= 63 characters Deprecated The S3 bucket for a valid pix4dmapper template .tmpl file |
(StandardTemplateEnum (string or null)) or (NullEnum (any or null)) Deprecated |
{
  "tags": [
    "rtk"
  ],
  "output_cs_horizontal": 1024,
  "output_cs_vertical": 1024,
  "output_cs_geoid": "string",
  "output_cs_geoid_height": 0,
  "outputs": [
    "ortho"
  ],
  "formats": {
    "ortho": [
      "tif"
    ],
    "dsm": [
      "tif"
    ],
    "mesh": [
      "slpk"
    ],
    "gaussian_splatting": [
      "gltf"
    ],
    "point_cloud": [
      "laz"
    ]
  },
  "area": {
    "plane": null,
    "thickness": 0
  },
  "custom_template_s3_key": "string",
  "custom_template_s3_bucket": "string",
  "standard_template": "3d-maps"
}
{- "tags": [
- "rtk"
], - "output_cs_horizontal": 1024,
- "output_cs_vertical": 1024,
- "output_cs_geoid": "string",
- "output_cs_geoid_height": 0,
- "outputs": [
- "ortho"
], - "formats": {
- "ortho": [
- "tif"
], - "dsm": [
- "tif"
], - "mesh": [
- "slpk"
], - "gaussian_splatting": [
- "gltf"
], - "point_cloud": [
- "laz"
]
}, - "area": {
- "plane": null,
- "thickness": 0
}, - "custom_template_s3_key": "string",
- "custom_template_s3_bucket": "string",
- "standard_template": "3d-maps"
}
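A payload like the one above can be assembled and sanity-checked client-side before sending. This is a sketch: the enum values and ranges come from the schema in this document, while the helper name, the validation policy, and the commented-out request call (base URL, token) are illustrative assumptions.

```python
# Sketch: build a processing-options payload for
# POST /project/api/v3/projects/{id}/processing_options/
# Any option omitted here falls back to the server-side default,
# since POST overwrites the whole set of options.

VALID_OUTPUTS = {"ortho", "dsm", "point_cloud", "mesh", "gaussian_splatting"}
VALID_TAGS = {"rtk", "building", "3d-maps", "nadir", "flat", "oblique",
              "half-scale", "quarter-scale", "high-confidence-positions"}

def build_processing_options(outputs=None, tags=None,
                             output_cs_horizontal=None):
    """Return a payload dict containing only the options to set."""
    payload = {}
    if outputs is not None:
        unknown = set(outputs) - VALID_OUTPUTS
        if unknown:
            raise ValueError(f"unknown outputs: {sorted(unknown)}")
        payload["outputs"] = list(outputs)
    if tags is not None:
        unknown = set(tags) - VALID_TAGS
        if unknown:
            raise ValueError(f"unknown tags: {sorted(unknown)}")
        payload["tags"] = list(tags)
    if output_cs_horizontal is not None:
        # EPSG codes accepted by the schema are in [1024, 32767]
        if not 1024 <= output_cs_horizontal <= 32767:
            raise ValueError("EPSG code must be in [1024, 32767]")
        payload["output_cs_horizontal"] = output_cs_horizontal
    return payload

payload = build_processing_options(outputs=["ortho", "dsm"], tags=["rtk"],
                                   output_cs_horizontal=2056)
# To actually send it (requires the `requests` package and a valid token):
# requests.post(f"{BASE_URL}/project/api/v3/projects/{project_id}/processing_options/",
#               json=payload, headers={"Authorization": f"Bearer {token}"})
```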
Update the project processing options.
RESTful interaction with the project's processing options. A shortcut to define processing options is to pass them directly to the start_processing endpoint.
Updates only the options passed in the payload. If called before a POST, default values are used for all other options, as projects are assumed to start with default options.
This endpoint accepts parameters in the payload; for their specific format, see the documentation of the set processing options (POST) endpoint.
id required | string^[0-9a-fA-F-]+$ |
tags | Array of strings or null (TagsEnum) Enum: "rtk" "building" "3d-maps" "nadir" "flat" "oblique" "half-scale" "quarter-scale" "high-confidence-positions" Project data classifications tags. Valid tags are: ['rtk', 'building', '3d-maps', 'nadir', 'flat', 'oblique', 'half-scale', 'quarter-scale', 'high-confidence-positions'] |
output_cs_horizontal | integer or null [ 1024 .. 32767 ] EPSG code of the desired output horizontal coordinate system |
output_cs_vertical | integer or null [ 1024 .. 32767 ] EPSG code of the desired output vertical coordinate system |
output_cs_geoid | string or null <= 50 characters |
output_cs_geoid_height | number or null <double> |
outputs | Array of strings or null (OutputsEnum) Enum: "ortho" "dsm" "point_cloud" "mesh" "gaussian_splatting" Outputs to be created when processing a project. Valid outputs are: ['ortho', 'dsm', 'point_cloud', 'mesh', 'gaussian_splatting'] |
object (FormatsFieldRequest) Specify formats for the requested outputs. Values for each output can define one or more formats from the available choices. Example:
object (AreaRequest) | |
custom_template_s3_key | string or null <= 1024 characters Deprecated The S3 key for a valid pix4dmapper template .tmpl file |
custom_template_s3_bucket | string or null <= 63 characters Deprecated The S3 bucket for a valid pix4dmapper template .tmpl file |
(StandardTemplateEnum (string or null)) or (NullEnum (any or null)) Deprecated |
{
  "tags": ["rtk"],
  "output_cs_horizontal": 1024,
  "output_cs_vertical": 1024,
  "output_cs_geoid": "string",
  "output_cs_geoid_height": 0,
  "outputs": ["ortho"],
  "formats": {
    "ortho": ["tif"],
    "dsm": ["tif"],
    "mesh": ["slpk"],
    "gaussian_splatting": ["gltf"],
    "point_cloud": ["laz"]
  },
  "area": {
    "plane": null,
    "thickness": 0
  },
  "custom_template_s3_key": "string",
  "custom_template_s3_bucket": "string",
  "standard_template": "3d-maps"
}
{
  "tags": ["rtk"],
  "output_cs_horizontal": 1024,
  "output_cs_vertical": 1024,
  "output_cs_geoid": "string",
  "output_cs_geoid_height": 0,
  "outputs": ["ortho"],
  "formats": {
    "ortho": ["tif"],
    "dsm": ["tif"],
    "mesh": ["slpk"],
    "gaussian_splatting": ["gltf"],
    "point_cloud": ["laz"]
  },
  "area": {
    "plane": null,
    "thickness": 0
  },
  "custom_template_s3_key": "string",
  "custom_template_s3_bucket": "string",
  "standard_template": "3d-maps"
}
Get the project S3 credentials.
Returns temporary AWS S3 credentials for accessing the project folder. These credentials are either read-write or read-only, depending on the user's rights on the project.
The key returned is the only location in AWS S3 where you are allowed to write for that project with the returned credentials. When uploading and registering files, make sure that all your paths are prefixed with this location.
id required | integer A unique integer value identifying this project. |
{
  "access_key": "string",
  "secret_key": "string",
  "session_token": "string",
  "expiration": "2019-08-24T14:15:22Z",
  "bucket": "string",
  "key": "string",
  "server_time": "2019-08-24T14:15:22Z",
  "region": "string"
}
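Because the returned key is the only S3 location writable with these credentials, a small helper can ensure every upload path stays under that prefix. This is a sketch: the `key` field name matches the response above, but the helper name and the example prefix value are illustrative.

```python
import posixpath

def project_object_key(creds, relative_path):
    """Build a full S3 object key under the project's writable prefix.

    `creds` is the response body of the S3-credentials endpoint; only its
    `key` field is used here. Raises ValueError if the resulting key would
    escape the allowed prefix (e.g. via "../").
    """
    prefix = creds["key"].rstrip("/")
    full = posixpath.normpath(posixpath.join(prefix, relative_path))
    if not full.startswith(prefix + "/"):
        raise ValueError(f"path escapes the allowed prefix: {relative_path}")
    return full

creds = {"key": "projects/12345"}  # illustrative value, not a real prefix
key = project_object_key(creds, "images/IMG_0001.JPG")
```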
Expects input images to have been uploaded and registered (returns a 400 if no images are registered). Also fails if the project is already processing or has been deleted.
This endpoint also checks your licenses and refuses to start if the number of images in the project exceeds the allowed number for your project.
If GCPs are registered in the project, it also validates that either coordinate_system is set on the project or the output_cs_* options are set in the processing_options.
Returns an estimation in seconds of the project processing time (estimated_time) if the project is not processed with credits; otherwise it returns 202 ACCEPTED.
You can pass an optional boolean query parameter send_email to disable automatic emails for the project. The default value is true, meaning email notifications are sent.
You can pass a payload of processing options (all optional) that will be passed to pix4dmapper. For more details about the processing options, see the processing options RESTful API documentation (POST /project/api/v3/projects/{id}/processing_options/).
id required | integer A unique integer value identifying this project. |
tags | Array of strings or null (TagsEnum) Enum: "rtk" "building" "3d-maps" "nadir" "flat" "oblique" "half-scale" "quarter-scale" "high-confidence-positions" Project data classifications tags. Valid tags are: ['rtk', 'building', '3d-maps', 'nadir', 'flat', 'oblique', 'half-scale', 'quarter-scale', 'high-confidence-positions'] |
output_cs_horizontal | integer or null [ 1024 .. 32767 ] EPSG code of the desired output horizontal coordinate system |
output_cs_vertical | integer or null [ 1024 .. 32767 ] EPSG code of the desired output vertical coordinate system |
output_cs_geoid | string or null <= 50 characters |
output_cs_geoid_height | number or null <double> |
outputs | Array of strings or null (OutputsEnum) Enum: "ortho" "dsm" "point_cloud" "mesh" "gaussian_splatting" Outputs to be created when processing a project. Valid outputs are: ['ortho', 'dsm', 'point_cloud', 'mesh', 'gaussian_splatting'] |
object (FormatsFieldRequest) Specify formats for the requested outputs. Values for each output can define one or more formats from the available choices. Example:
object (AreaRequest) | |
custom_template_s3_key | string or null <= 1024 characters Deprecated The S3 key for a valid pix4dmapper template .tmpl file |
custom_template_s3_bucket | string or null <= 63 characters Deprecated The S3 bucket for a valid pix4dmapper template .tmpl file |
(StandardTemplateEnum (string or null)) or (NullEnum (any or null)) Deprecated |
{
  "tags": ["rtk"],
  "output_cs_horizontal": 1024,
  "output_cs_vertical": 1024,
  "output_cs_geoid": "string",
  "output_cs_geoid_height": 0,
  "outputs": ["ortho"],
  "formats": {
    "ortho": ["tif"],
    "dsm": ["tif"],
    "mesh": ["slpk"],
    "gaussian_splatting": ["gltf"],
    "point_cloud": ["laz"]
  },
  "area": {
    "plane": null,
    "thickness": 0
  },
  "custom_template_s3_key": "string",
  "custom_template_s3_bucket": "string",
  "standard_template": "3d-maps"
}
{
  "tags": ["rtk"],
  "output_cs_horizontal": 1024,
  "output_cs_vertical": 1024,
  "output_cs_geoid": "string",
  "output_cs_geoid_height": 0,
  "outputs": ["ortho"],
  "formats": {
    "ortho": ["tif"],
    "dsm": ["tif"],
    "mesh": ["slpk"],
    "gaussian_splatting": ["gltf"],
    "point_cloud": ["laz"]
  },
  "area": {
    "plane": null,
    "thickness": 0
  },
  "custom_template_s3_key": "string",
  "custom_template_s3_bucket": "string",
  "standard_template": "3d-maps"
}
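The two success shapes described above (an estimated_time body, or a plain 202 ACCEPTED for credit-based processing) can be distinguished in client code. A minimal sketch, assuming the status-code semantics stated in this document; the function name and message strings are illustrative.

```python
def interpret_start_response(status_code, body=None):
    """Map a start-processing HTTP response to a short summary string.

    Per the documentation: non-credit projects get an `estimated_time`
    (in seconds) in the body; credit-based processing returns 202 ACCEPTED;
    400 covers unregistered images and missing GCP coordinate systems.
    """
    if status_code == 202:
        return "processing accepted (credit-based, no time estimate)"
    if status_code == 200 and body and "estimated_time" in body:
        return f"processing started, ~{body['estimated_time']} s estimated"
    if status_code == 400:
        return "rejected: check registered images and coordinate system options"
    return f"unexpected response: {status_code}"
```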
Retrieve temporary AWS S3 credentials to access the projects and/or project groups passed in the payload. The credentials are always read-only.
This endpoint returns only the credentials, not the additional bucket/region/key information returned by the s3_credentials endpoint.
The input payload can contain two keys: project_ids, which must be a list of integer project IDs (not empty if provided), and project_group_ids, which must be a list of integer project group IDs (not empty if provided).
The combined number of projects and project groups is limited to a small number (about 10) due to length limitations on AWS IAM policies. If the limit is exceeded, a 400 error is raised with an error_code of TOO_MANY_PROJECTS.
If any of the projects and/or project groups in the list are:
project_ids | Array of integers non-empty |
project_group_ids | Array of integers non-empty |
{
  "project_ids": [0],
  "project_group_ids": [0]
}
{
  "project_ids": [0],
  "project_group_ids": [0]
}
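Since the endpoint caps the combined number of projects and project groups (about 10, per the IAM policy-length limitation above), requests for larger sets can be split into batches. A sketch; the limit constant reflects the approximate figure stated above, and the helper name is illustrative.

```python
MAX_IDS_PER_REQUEST = 10  # approximate limit stated by the API docs

def batch_credential_payloads(project_ids, project_group_ids=()):
    """Split IDs into payloads that stay under the per-request limit.

    Each yielded dict is a body for the read-only credentials endpoint;
    empty lists are omitted, since the API rejects empty arrays.
    """
    ids = [("project_ids", i) for i in project_ids]
    ids += [("project_group_ids", i) for i in project_group_ids]
    for start in range(0, len(ids), MAX_IDS_PER_REQUEST):
        payload = {}
        for kind, value in ids[start:start + MAX_IDS_PER_REQUEST]:
            payload.setdefault(kind, []).append(value)
        yield payload

payloads = list(batch_credential_payloads(range(1, 13), [100, 101]))
```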
Validate the WKT string.
Validation is done according to pix4dmapper requirements. Takes JSON as input, with wkt_string containing the WKT to validate:
{
  "wkt_string": "PROJCS[\"Custom coordinate system CUSTOM_OBLIQUE..."
}
wkt_string required | string non-empty |
{
  "wkt_string": "string"
}
{
  "wkt_string": "string"
}
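The endpoint above is the authoritative validator, but a cheap client-side sanity check (balanced brackets and paired double quotes) can catch obviously malformed WKT before a round trip. This heuristic is an illustrative sketch and not a substitute for the endpoint's pix4dmapper rules.

```python
def wkt_looks_balanced(wkt):
    """Rough pre-check: square brackets balanced, double quotes paired.

    Brackets inside quoted names are ignored. Only a heuristic; the
    validate-WKT endpoint applies the actual pix4dmapper requirements.
    """
    depth = 0
    in_string = False
    for ch in wkt:
        if ch == '"':
            in_string = not in_string
        elif not in_string:
            if ch == "[":
                depth += 1
            elif ch == "]":
                depth -= 1
                if depth < 0:  # closing bracket before any opening one
                    return False
    return depth == 0 and not in_string
```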
List project groups.
ordering | string Which field to use when ordering the results. |
page | integer A page number within the paginated result set. |
page_size | integer Number of results to return per page. |
search | string A search term. |
{
  "count": 123,
  "results": [
    {
      "id": 0,
      "uuid": "095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
      "name": "string",
      "project_group_type": "pro",
      "is_aligned": false,
      "is_demo": true,
      "project_list_url": "string",
      "latest_project_date": "2019-08-24T14:15:22Z",
      "latest_project_url": "string",
      "latest_project_id": 0,
      "latitude": 0,
      "longitude": 0,
      "project_group_thumb": "string",
      "project_count": "string",
      "public_share_token": "string",
      "project_status_count": "string",
      "owner_uuid": "a528e82a-c54a-4046-8831-44d7f9028f54",
      "owner_type": "ORG_GRP",
      "parent_id": "string",
      "parent_uuid": "77932ac3-028b-48fa-aaa9-4d11b1d1236a",
      "parent_type": "organization",
      "coordinate_system": "string",
      "crs": {
        "definition": "string",
        "geoid_height": 0,
        "extensions": null,
        "identifier": "string",
        "name": "string",
        "type": "string",
        "units": {
          "property1": null,
          "property2": null
        },
        "axes": {
          "property1": null,
          "property2": null
        },
        "epsg": 0,
        "esri": 0,
        "wkt1": "string",
        "wkt2": "string",
        "proj_pipeline_to_wgs84": "string"
      }
    }
  ]
}
Create a project group.
The coordinate system is compliant with the Open Photogrammetry Format (OPF) specification.
If passed, coordinate_system is a coordinate system definition (e.g. EPSG:21781).
In addition, the following values are accepted for arbitrary coordinate systems:
coordinate_system: ARBITRARY_METERS for the software to use an arbitrary default coordinate system in meters
coordinate_system: ARBITRARY_FEET for the software to use an arbitrary default coordinate system in feet
coordinate_system: ARBITRARY_US_FEET for the software to use an arbitrary default coordinate system in US survey feet
If coordinate_system is passed, two optional fields can be added:
coordinate_system_geoid_height: a float defining the constant geoid height over the underlying ellipsoid, in the units of the vertical CRS axis.
coordinate_system_extensions: a JSON object with extension-specific objects (for more information: https://pix4d.github.io/opf-spec/specification/control_points.html#crs).
Returns a 400 error code if the coordinate_system is invalid.
Unsupported cases:
If a coordinate system is provided for a project group, then new projects created in this group will inherit this coordinate system by default (if not specified).
project_group_type can take the following values: pro, bim, model, ag.
Organization Management users should specify the 'parent' of the project group by passing:
parent_type: one of organization or folder.
parent_uuid: The uuid of the parent.
name required | string [ 1 .. 100 ] characters |
project_group_type required | string (SolutionEnum) Enum: "pro" "bim" "ag" "model" "inspection"
|
owner_uuid | string or null <uuid> |
(OwnerTypeCe4Enum (string or null)) or (NullEnum (any or null)) | |
parent_id | string non-empty |
parent_uuid | string <uuid> |
parent_type | string (ProjectGroupParentTypeEnum) Enum: "organization" "user" "folder"
|
coordinate_system | string or null |
coordinate_system_geoid_height | number or null <double> |
coordinate_system_extensions | any or null |
{
  "name": "string",
  "project_group_type": "pro",
  "owner_uuid": "a528e82a-c54a-4046-8831-44d7f9028f54",
  "owner_type": "ORG_GRP",
  "parent_id": "string",
  "parent_uuid": "77932ac3-028b-48fa-aaa9-4d11b1d1236a",
  "parent_type": "organization",
  "coordinate_system": "string",
  "coordinate_system_geoid_height": 0,
  "coordinate_system_extensions": null
}
{
  "id": 0,
  "uuid": "095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
  "name": "string",
  "project_group_type": "pro",
  "is_aligned": false,
  "is_demo": true,
  "project_list_url": "string",
  "latest_project_date": "2019-08-24T14:15:22Z",
  "latest_project_url": "string",
  "latest_project_id": 0,
  "latitude": 0,
  "longitude": 0,
  "project_group_thumb": "string",
  "project_count": "string",
  "public_share_token": "string",
  "project_status_count": "string",
  "owner_uuid": "a528e82a-c54a-4046-8831-44d7f9028f54",
  "owner_type": "ORG_GRP",
  "parent_id": "string",
  "parent_uuid": "77932ac3-028b-48fa-aaa9-4d11b1d1236a",
  "parent_type": "organization",
  "coordinate_system": "string",
  "crs": {
    "definition": "string",
    "geoid_height": 0,
    "extensions": null,
    "identifier": "string",
    "name": "string",
    "type": "string",
    "units": {
      "property1": null,
      "property2": null
    },
    "axes": {
      "property1": null,
      "property2": null
    },
    "epsg": 0,
    "esri": 0,
    "wkt1": "string",
    "wkt2": "string",
    "proj_pipeline_to_wgs84": "string"
  }
}
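A create-project-group body following the rules above can be assembled like this. A sketch: the field names and enum values come from this document, while the helper name and its validation choices are illustrative; whether a given coordinate_system string is valid is ultimately decided server-side (400 on invalid).

```python
VALID_GROUP_TYPES = {"pro", "bim", "model", "ag"}

def build_project_group_payload(name, project_group_type,
                                coordinate_system=None, geoid_height=None):
    """Assemble a create-project-group request body."""
    if not 1 <= len(name) <= 100:
        raise ValueError("name must be 1..100 characters")
    if project_group_type not in VALID_GROUP_TYPES:
        raise ValueError(f"project_group_type must be one of {sorted(VALID_GROUP_TYPES)}")
    payload = {"name": name, "project_group_type": project_group_type}
    if coordinate_system is not None:
        payload["coordinate_system"] = coordinate_system
        if geoid_height is not None:
            # constant geoid height over the ellipsoid, in vertical CRS units
            payload["coordinate_system_geoid_height"] = geoid_height
    return payload

payload = build_project_group_payload("Survey 2024", "pro",
                                      coordinate_system="EPSG:21781",
                                      geoid_height=47.3)
```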
Retrieve a project group.
id required | integer A unique integer value identifying this project group. |
{
  "id": 0,
  "uuid": "095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
  "name": "string",
  "project_group_type": "pro",
  "is_aligned": false,
  "is_demo": true,
  "project_list_url": "string",
  "latest_project_date": "2019-08-24T14:15:22Z",
  "latest_project_url": "string",
  "latest_project_id": 0,
  "latitude": 0,
  "longitude": 0,
  "project_group_thumb": "string",
  "project_count": "string",
  "public_share_token": "string",
  "project_status_count": "string",
  "owner_uuid": "a528e82a-c54a-4046-8831-44d7f9028f54",
  "owner_type": "ORG_GRP",
  "parent_id": "string",
  "parent_uuid": "77932ac3-028b-48fa-aaa9-4d11b1d1236a",
  "parent_type": "organization",
  "coordinate_system": "string",
  "crs": {
    "definition": "string",
    "geoid_height": 0,
    "extensions": null,
    "identifier": "string",
    "name": "string",
    "type": "string",
    "units": {
      "property1": null,
      "property2": null
    },
    "axes": {
      "property1": null,
      "property2": null
    },
    "epsg": 0,
    "esri": 0,
    "wkt1": "string",
    "wkt2": "string",
    "proj_pipeline_to_wgs84": "string"
  }
}
Partially update a project group.
The coordinate system is compliant with the Open Photogrammetry Format (OPF) specification.
If passed, coordinate_system is a coordinate system definition (e.g. EPSG:21781).
In addition, the following values are accepted for arbitrary coordinate systems:
coordinate_system: ARBITRARY_METERS for the software to use an arbitrary default coordinate system in meters
coordinate_system: ARBITRARY_FEET for the software to use an arbitrary default coordinate system in feet
coordinate_system: ARBITRARY_US_FEET for the software to use an arbitrary default coordinate system in US survey feet
If coordinate_system is passed, two optional fields can be added:
coordinate_system_geoid_height: a float defining the constant geoid height over the underlying ellipsoid, in the units of the vertical CRS axis.
coordinate_system_extensions: a JSON object with extension-specific objects (for more information: https://pix4d.github.io/opf-spec/specification/control_points.html#crs).
Returns a 400 error code if the coordinate_system is invalid.
Unsupported cases:
If a coordinate system is provided for a project group, then new projects created in this group will inherit this coordinate system by default (if not specified).
project_group_type can take the following values: pro, bim, ag.
id required | integer A unique integer value identifying this project group. |
name | string [ 1 .. 100 ] characters |
project_group_type | string (SolutionEnum) Enum: "pro" "bim" "ag" "model" "inspection"
|
owner_uuid | string or null <uuid> |
(OwnerTypeCe4Enum (string or null)) or (NullEnum (any or null)) | |
parent_id | string non-empty |
parent_uuid | string <uuid> |
parent_type | string (ProjectGroupParentTypeEnum) Enum: "organization" "user" "folder"
|
coordinate_system | string or null |
coordinate_system_geoid_height | number or null <double> |
coordinate_system_extensions | any or null |
{
  "name": "string",
  "project_group_type": "pro",
  "owner_uuid": "a528e82a-c54a-4046-8831-44d7f9028f54",
  "owner_type": "ORG_GRP",
  "parent_id": "string",
  "parent_uuid": "77932ac3-028b-48fa-aaa9-4d11b1d1236a",
  "parent_type": "organization",
  "coordinate_system": "string",
  "coordinate_system_geoid_height": 0,
  "coordinate_system_extensions": null
}
{
  "id": 0,
  "uuid": "095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
  "name": "string",
  "project_group_type": "pro",
  "is_aligned": false,
  "is_demo": true,
  "project_list_url": "string",
  "latest_project_date": "2019-08-24T14:15:22Z",
  "latest_project_url": "string",
  "latest_project_id": 0,
  "latitude": 0,
  "longitude": 0,
  "project_group_thumb": "string",
  "project_count": "string",
  "public_share_token": "string",
  "project_status_count": "string",
  "owner_uuid": "a528e82a-c54a-4046-8831-44d7f9028f54",
  "owner_type": "ORG_GRP",
  "parent_id": "string",
  "parent_uuid": "77932ac3-028b-48fa-aaa9-4d11b1d1236a",
  "parent_type": "organization",
  "coordinate_system": "string",
  "crs": {
    "definition": "string",
    "geoid_height": 0,
    "extensions": null,
    "identifier": "string",
    "name": "string",
    "type": "string",
    "units": {
      "property1": null,
      "property2": null
    },
    "axes": {
      "property1": null,
      "property2": null
    },
    "epsg": 0,
    "esri": 0,
    "wkt1": "string",
    "wkt2": "string",
    "proj_pipeline_to_wgs84": "string"
  }
}
Get the project group S3 credentials.
Retrieve temporary AWS S3 credentials to access the project group folder. These credentials are either read-write or read-only, depending on the user's rights to access the project group.
The bucket and the region returned in the response are specific to the project group, and do not necessarily match the storage location of the projects belonging to the group.
Response
{
  "access_key": "foo",
  "secret_key": "secret",
  "session_token": "session_token",
  "expiration": 17200,
  "bucket": "project-group-bucket",
  "key": "S3-prefix-of-the-project-group-bucket",
  "region": "project-group-region"
}
id required | integer A unique integer value identifying this project group. |
{
  "access_key": "string",
  "secret_key": "string",
  "session_token": "string",
  "expiration": "2019-08-24T14:15:22Z",
  "bucket": "string",
  "key": "string",
  "server_time": "2019-08-24T14:15:22Z",
  "region": "string"
}
Return the currently logged-in user's information.
For external system references, the user uuid should be used, not the user id.
preferred_units can take the values metric | imperial.
license_per_solution_summary contains a value for each solution, which can be:
OK: the user has a valid license for this solution (trial, or OTC with S&U, or rental)
OTC_EXPIRED: the user has an OTC license for this solution but with expired S&U
NONE: the user has no valid license for this solution
Note: license_per_solution_summary does not contain a detailed/granular view of the user permissions. Use the permission endpoint for that.
{
  "uuid": "095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
  "first_name": "string",
  "last_name": "string",
  "email": "string",
  "preferred_units": "metric",
  "is_staff": "string",
  "is_partner": "string",
  "solution": "pro",
  "license_per_solution_summary": "string",
  "is_confirmed": "string",
  "preferred_language": "string",
  "portal_type": "string",
  "testing_group": "string",
  "trial_blacklist_reason": "string",
  "country": "AF",
  "city": "string",
  "zip": "string",
  "title": "string",
  "phone": "string",
  "preferred_theme": "dark",
  "default_organization": "570b8884-2314-432f-8691-2fac663f140c",
  "region": "string",
  "preferred_infra": 0,
  "is_active": "string",
  "is_eum_enabled": true,
  "is_free_domain": true,
  "auto_topup_store_product": "string",
  "auto_topup_order_reference": "string",
  "hubspot_id": "string"
}
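The three documented license_per_solution_summary values can be interpreted in client code. A sketch, assuming the summary is a mapping from solution name to one of the three values described above; the function name and message strings are illustrative.

```python
def license_status(summary, solution):
    """Interpret license_per_solution_summary for one solution.

    Returns a short human-readable explanation based on the three
    documented values: OK, OTC_EXPIRED, NONE. A missing solution is
    treated like NONE.
    """
    status = summary.get(solution, "NONE")
    return {
        "OK": "valid license (trial, OTC with S&U, or rental)",
        "OTC_EXPIRED": "OTC license with expired S&U",
        "NONE": "no valid license",
    }.get(status, f"unknown status: {status}")

msg = license_status({"pro": "OK", "bim": "OTC_EXPIRED"}, "pro")
```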
List tokens created for your projects and project groups, or those owned by your organization. Tokens can be filtered using query parameters and the available fields of the token.
For example, to filter tokens by project type, you can add ?type=Project to the url.
When used within an organization, one of the following must be specified: the organization (with owner_uuid), or the entity (with type and type_id).
enabled | boolean |
page | integer A page number within the paginated result set. |
page_size | integer Number of results to return per page. |
type | string Enum: "Project" "ProjectGroup"
|
type_id | integer |
write | boolean |
{
  "count": 123,
  "results": [
    {
      "token": "string",
      "enabled": true,
      "write": true,
      "type": "Project",
      "type_id": 0,
      "creation_date": "2019-08-24T14:15:22Z",
      "created_by": "ee824cad-d7a6-4f48-87dc-e8461a9201c4"
    }
  ]
}
Create a token for a project or project group owned by you, or owned by your organization.
type: Project or ProjectGroup.
type_id: ID of the project or project group.
write: true for read/write tokens, false for read-only tokens.
Payload format:
{
"type": "Project",
"type_id": 123,
"write": true,
"enabled": true
}
It is recommended to use tokens instead of the deprecated public_url found in projects and project groups.
token | string non-empty |
enabled | boolean |
write | boolean |
type required | string (Type6f2Enum) Enum: "Project" "ProjectGroup"
|
type_id required | integer [ -2147483648 .. 2147483647 ] |
{
  "token": "string",
  "enabled": true,
  "write": true,
  "type": "Project",
  "type_id": -2147483648
}
{
  "token": "string",
  "enabled": true,
  "write": true,
  "type": "Project",
  "type_id": -2147483648,
  "creation_date": "2019-08-24T14:15:22Z"
}
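A token-creation body with the fields listed above can be built and validated client-side. A sketch: the type enum and field names come from this document; the helper name is illustrative.

```python
def build_token_payload(type_, type_id, write=False, enabled=True):
    """Body for creating a sharing token.

    `type` must be "Project" or "ProjectGroup"; write=True yields a
    read/write token, write=False a read-only one.
    """
    if type_ not in ("Project", "ProjectGroup"):
        raise ValueError('type must be "Project" or "ProjectGroup"')
    return {"type": type_, "type_id": int(type_id),
            "write": bool(write), "enabled": bool(enabled)}

payload = build_token_payload("Project", 123, write=True)
```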
Permission token used for sharing access.
id required | string <uuid> A UUID string identifying this permission token. |
{
  "token": "string",
  "enabled": true,
  "write": true,
  "type": "Project",
  "type_id": 0,
  "creation_date": "2019-08-24T14:15:22Z",
  "created_by": "ee824cad-d7a6-4f48-87dc-e8461a9201c4"
}
Modify the enabled or write fields of the token.
Payload format:
{
"write": false,
"enabled": true
}
id required | string <uuid> A UUID string identifying this permission token. |
token | string non-empty |
enabled | boolean |
write | boolean |
{
  "token": "string",
  "enabled": true,
  "write": true
}
{
  "token": "string",
  "enabled": true,
  "write": true,
  "type": "Project",
  "type_id": 0,
  "creation_date": "2019-08-24T14:15:22Z",
  "created_by": "ee824cad-d7a6-4f48-87dc-e8461a9201c4"
}
Create new annotations by passing an array of annotations.
required | Array of objects (AnnotationCreate) |
{
  "annotations": [
    {
      "entity_type": "Project",
      "entity_id": 0,
      "extension": { },
      "geometry": {
        "type": "Point",
        "bbox": [0, 0, 0, 0],
        "coordinates": [1.1, 6.7]
      },
      "properties": {
        "name": "string",
        "description": "string",
        "color": "#13579bdf",
        "color_fill": "#13579bdf",
        "camera_position": [1.1, 6.7, 3.5],
        "visible": true,
        "volume": {
          "hash": "string",
          "cut": 0,
          "cut_error": 0,
          "fill": 0,
          "fill_error": 0,
          "custom_elevation": 0,
          "fitting": "Average"
        }
      },
      "attachments": [
        {
          "display_type": "image",
          "mime_type": "image/png",
          "source": {
            "s3_key": "string",
            "s3_bucket": "string",
            "s3_region": "string"
          },
          "description": "string"
        }
      ],
      "tags": ["string"],
      "version": "1.0"
    }
  ]
}
{
  "annotations": [
    {
      "annotation_id": "Project_697180_6d4ab2d7-7c0b-4c41-8144-4b2e6717da14",
      "success": true
    },
    {
      "annotation_id": "Project_697180_d260700c-90da-427e-b504-12359def0be9",
      "success": true
    },
    {
      "annotation_id": "Project_697180_f6b47059-f007-47e4-af59-cf78a8e55622",
      "success": true
    }
  ]
}
List annotations according to specific criteria.
entity_type required | string |
entity_id required | integer |
last_key | string |
page_size | integer |
shareToken | string |
{
  "next": "string",
  "results": [
    {
      "entity_type": "Project",
      "entity_id": 0,
      "extension": { },
      "geometry": {
        "type": "Point",
        "bbox": [0, 0, 0, 0],
        "coordinates": [1.1, 6.7]
      },
      "properties": {
        "name": "string",
        "description": "string",
        "color": "#13579bdf",
        "color_fill": "#13579bdf",
        "camera_position": [1.1, 6.7, 3.5],
        "visible": true,
        "volume": {
          "hash": "string",
          "cut": 0,
          "cut_error": 0,
          "fill": 0,
          "fill_error": 0,
          "custom_elevation": 0,
          "fitting": "Average"
        }
      },
      "attachments": [
        {
          "display_type": "image",
          "mime_type": "image/png",
          "source": {
            "s3_key": "string",
            "s3_bucket": "string",
            "s3_region": "string"
          },
          "description": "string"
        }
      ],
      "tags": ["string"],
      "version": "1.0",
      "id": "Project_123456_12345678-1234-1234-1234-123456789abc",
      "created": "2021-09-16T07:05:39.610209+00:00",
      "modified": "2021-09-16T07:05:39.610209+00:00"
    }
  ]
}
Delete a list of annotations.
annotations required | Array of strings (AnnotationId) |
{
  "annotations": [
    "Project_700123_095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
    "Project_700123_8161163a-f227-466f-bc01-090a01e80165",
    "Project_700123_c7612fbf-6c9f-a8ad-4c33-8e9c095be615"
  ]
}
{
  "annotations": [
    {
      "annotation_id": "Project_697180_6d4ab2d7-7c0b-4c41-8144-4b2e6717da14",
      "success": true
    },
    {
      "annotation_id": "Project_697180_d260700c-90da-427e-b504-12359def0be9",
      "success": true
    },
    {
      "annotation_id": "Project_697180_f6b47059-f007-47e4-af59-cf78a8e55622",
      "success": true
    }
  ]
}
annotation_id required | string |
entity_type required | string Value: "Project" The main entity that the given annotation will refer to. |
entity_id required | integer |
object | |
required | GeoJSONPoint (object) or GeoJSONLineString (object) or GeoJSONPolygon (object) or Circle (object) (Geometries) |
object (AnnotationProperties) | |
Array of objects (AnnotationAttachments) | |
tags | Array of strings (AnnotationTags) |
version | string (AnnotationVersion) version of an annotation schema in the format of MAJOR.MINOR |
{
  "entity_type": "Project",
  "entity_id": 0,
  "extension": { },
  "geometry": {
    "type": "Point",
    "bbox": [0, 0, 0, 0],
    "coordinates": [1.1, 6.7]
  },
  "properties": {
    "name": "string",
    "description": "string",
    "color": "#13579bdf",
    "color_fill": "#13579bdf",
    "camera_position": [1.1, 6.7, 3.5],
    "visible": true,
    "volume": {
      "hash": "string",
      "cut": 0,
      "cut_error": 0,
      "fill": 0,
      "fill_error": 0,
      "custom_elevation": 0,
      "fitting": "Average"
    }
  },
  "attachments": [
    {
      "display_type": "image",
      "mime_type": "image/png",
      "source": {
        "s3_key": "string",
        "s3_bucket": "string",
        "s3_region": "string"
      },
      "description": "string"
    }
  ],
  "tags": ["string"],
  "version": "1.0"
}
{
  "entity_type": "Project",
  "entity_id": 0,
  "extension": { },
  "geometry": {
    "type": "Point",
    "bbox": [0, 0, 0, 0],
    "coordinates": [1.1, 6.7]
  },
  "properties": {
    "name": "string",
    "description": "string",
    "color": "#13579bdf",
    "color_fill": "#13579bdf",
    "camera_position": [1.1, 6.7, 3.5],
    "visible": true,
    "volume": {
      "hash": "string",
      "cut": 0,
      "cut_error": 0,
      "fill": 0,
      "fill_error": 0,
      "custom_elevation": 0,
      "fitting": "Average"
    }
  },
  "attachments": [
    {
      "display_type": "image",
      "mime_type": "image/png",
      "source": {
        "s3_key": "string",
        "s3_bucket": "string",
        "s3_region": "string"
      },
      "description": "string"
    }
  ],
  "tags": ["string"],
  "version": "1.0",
  "id": "Project_123456_12345678-1234-1234-1234-123456789abc",
  "created": "2021-09-16T07:05:39.610209+00:00",
  "modified": "2021-09-16T07:05:39.610209+00:00"
}
Delete all annotations for a given entity.
entity_type required | string |
entity_id required | integer |
shareToken | string |
{
  "title": "Validation Error",
  "errors": {
    "entity_type": [
      "Missing data for required field."
    ],
    "wrong_param": [
      "Unknown field."
    ]
  }
}
Lists the folders, projects and project groups in the drive root or in a specific folder. This endpoint lists elements only one level deep and is not used to list the hierarchical order of a folder tree. If a user has access within the Organization, but not at the Organization root, then listing the children of the Organization will return the "access points" granted within the organization (though do note that any access points nested inside another access point will not be returned).
parent_type: must be one of organization, user, folder. organization and user are used to list elements in the drive root; folder is used to list elements in a specific folder.
parent_uuid: The uuid identifying the organization, user or folder.
include_projects: Used to filter the elements by projects. By default, it is set to true.
include_project_groups: Used to filter the elements by project_groups. By default, it is set to true.
ordering: Used to sort the elements either by name or by date. The usage is one of: ordering=date, ordering=-date, ordering=name, ordering=-name.
full_metadata: Used to retrieve additional metadata of the drive object. The usage is true or false, regardless of case sensitivity.
200: Successful
400: Invalid full_metadata param
403: The user is not permitted to access the parent
404: parent_type or parent_uuid is invalid or does not exist
parent_type required | string Enum: "folder" "organization" "user" |
parent_uuid required | string <uuid> |
full_metadata | boolean Default: false |
include_project_groups | boolean Default: true |
include_projects | boolean Default: true |
ordering | string Enum: "-date" "-name" "date" "name" |
page | integer |
page_size | integer |
{
  "count": 31,
  "previous": null,
  "results": [
    {
      "legacy_id": 32,
      "uuid": "6d2e9879-2421-4bda-8288-1e8debc249a7",
      "type": "project",
      "date": "2023-05-17T16:35:31.183411+02:00",
      "name": "bar-4",
      "metadata": {
        "status": "CREATED",
        "latitude": 0,
        "longitude": 0
      }
    }
  ]
}
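The query parameters described above can be assembled with a small helper that enforces the documented enums and serializes booleans as lowercase strings (the API accepts true/false in any case). A sketch; the parameter names and allowed values come from this document, the helper name is illustrative.

```python
def list_children_params(parent_type, parent_uuid, ordering=None,
                         include_projects=True, include_project_groups=True,
                         full_metadata=False):
    """Query parameters for the drive children-listing endpoint."""
    if parent_type not in ("organization", "user", "folder"):
        raise ValueError("parent_type must be organization, user or folder")
    if ordering is not None and ordering not in ("date", "-date", "name", "-name"):
        raise ValueError("ordering must be one of date, -date, name, -name")
    params = {
        "parent_type": parent_type,
        "parent_uuid": parent_uuid,
        "include_projects": str(include_projects).lower(),
        "include_project_groups": str(include_project_groups).lower(),
        "full_metadata": str(full_metadata).lower(),
    }
    if ordering:
        params["ordering"] = ordering
    return params

params = list_children_params("folder", "6c6b0736-c9a1-4eda-b03a-daf19b5d93a1",
                              ordering="-date")
```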
Returns a list of nodes which are the closest ancestors of the specified node.
Users without a role at the organization will have the list limited to the nodes to which they have access within the organization.
parent_type: must be one of organization, user, folder, project or project_group.
parent_uuid: The uuid identifying the organization, user, folder, project or project_group.
max_path_items: The maximum number of path items (including the current node) to return for the given node. The default value is shown in the example below.
200: Successful
400: max_path_items param is invalid
403: The user is not permitted to access the parent
404: parent_type or parent_uuid is invalid or does not exist
parent_type required | string Enum: "folder" "organization" "project" "project_group" "user" |
parent_uuid required | string <uuid> |
max_path_items | integer Default: 3 |
{
  "path": [
    {
      "type": "folder",
      "uuid": "6c6b0736-c9a1-4eda-b03a-daf19b5d93a1",
      "name": "bar-4"
    }
  ],
  "owner": {
    "type": "organization",
    "uuid": "55ce2843-05c5-4772-9262-8748bf83ddac",
    "name": "ABC Corporation"
  },
  "more_ancestors": true
}
Lists folders, projects and project groups whose (display) names match the supplied query parameter. Searching is case-insensitive and done to any depth in the resource tree of the organization or user that are specified by the parent_type and parent_uuid parameters.
parent_type: It must be either organization or user.
parent_uuid: The uuid identifying the organization or user.
ordering: Used to sort the elements either by name or by date in ascending or descending order, with an initial '-' indicating descending order, e.g. -date or name.
full_metadata: Used to retrieve additional metadata of the drive object. The value is true or false, case-insensitive.
q: The search text.
exclude_grouped_projects: If true, the search results will not include projects from inside groups.
include_projects: Used to filter the elements by projects. By default, it is set to true.
include_project_groups: Used to filter the elements by project_groups. By default, it is set to true.

Returns HTTP status:
200: Successful
400: Invalid full_metadata query param
403: The user is not permitted to access the parent
404: parent_type or parent_uuid is invalid or does not exist

parent_type required | string Enum: "organization" "user" |
parent_uuid required | string <uuid> |
exclude_grouped_projects | boolean Default: false |
full_metadata | boolean Default: false |
include_project_groups | boolean Default: true |
include_projects | boolean Default: true |
ordering | string Enum: "-date" "-name" "date" "name" |
page | integer |
page_size | integer |
q required | string |
{
  "count": 1,
  "next": null,
  "previous": null,
  "results": [
    {
      "legacy_id": 316,
      "uuid": "c44d5e69-3f11-4509-9447-f93455d6825a",
      "type": "project",
      "date": "2021-03-04T12:06:22.710263+02:00",
      "name": "farm building",
      "metadata": {
        "status": "CREATED",
        "latitude": 0,
        "longitude": 0
      }
    }
  ]
}
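A client can assemble the search query string from these parameters before sending the request. A minimal sketch, assuming nothing beyond the parameter table above (the helper name is illustrative):

```python
# Allowed ordering values, per the enum above.
VALID_ORDERINGS = {"-date", "-name", "date", "name"}

def build_search_params(parent_type, parent_uuid, q, ordering=None,
                        full_metadata=False, exclude_grouped_projects=False,
                        include_projects=True, include_project_groups=True):
    """Validate inputs and return the query parameters for the search endpoint."""
    if parent_type not in ("organization", "user"):
        raise ValueError("parent_type must be 'organization' or 'user'")
    if not q:
        raise ValueError("q (search text) is required")
    if ordering is not None and ordering not in VALID_ORDERINGS:
        raise ValueError(f"ordering must be one of {sorted(VALID_ORDERINGS)}")
    params = {
        "parent_type": parent_type,
        "parent_uuid": parent_uuid,
        "q": q,
        # Booleans are serialized lowercase; the API accepts them case-insensitively.
        "full_metadata": str(full_metadata).lower(),
        "exclude_grouped_projects": str(exclude_grouped_projects).lower(),
        "include_projects": str(include_projects).lower(),
        "include_project_groups": str(include_project_groups).lower(),
    }
    if ordering:
        params["ordering"] = ordering
    return params
```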
Moves multiple nodes within an organization.

source_nodes: The nodes that will be moved to the target. The source_nodes must not be empty or contain duplicates, and must all belong to the same parent. Each node is specified with its type and uuid.
target_type: The type of the parent where nodes will be moved.
target_uuid: The uuid of the parent where nodes will be moved.
owner_uuid: The uuid of the organization inside which the nodes are located. (This is used to check the user has permission to perform this move.)

owner_uuid required | string <uuid> |
source_nodes required | Array of objects (SourceNodeRequest) |
target_type required | string (MoveBatchTargetTypeEnum) Enum: "organization" "user" "folder" "project_group" |
target_uuid required | string <uuid> |
{
  "source_nodes": [
    {
      "type": "folder",
      "uuid": "3e5b9b56-3ae8-424a-bc15-ab7f42550adb"
    },
    {
      "type": "project",
      "uuid": "1a0742b5-80fb-4a32-be65-e305c37c13c7"
    },
    {
      "type": "project_group",
      "uuid": "e55e1729-0467-49c3-82b3-73e9dd88d41e"
    }
  ],
  "target_type": "folder",
  "target_uuid": "941f971b-bb21-47a2-a1da-6b2604ea9429",
  "owner_uuid": "e55e1729-0467-49c3-82b3-73e9dd88d41e"
}
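The documented constraints on source_nodes (non-empty, no duplicates) can be checked before issuing the request; the same-parent constraint is omitted here since verifying it requires server-side knowledge. A sketch, with the helper name and checks being illustrative:

```python
# Allowed move targets, per MoveBatchTargetTypeEnum above.
VALID_TARGET_TYPES = {"organization", "user", "folder", "project_group"}

def build_move_body(source_nodes, target_type, target_uuid, owner_uuid):
    """Validate and assemble the request body for the batch-move endpoint."""
    if not source_nodes:
        raise ValueError("source_nodes must not be empty")
    uuids = [node["uuid"] for node in source_nodes]
    if len(uuids) != len(set(uuids)):
        raise ValueError("source_nodes must not contain duplicates")
    if target_type not in VALID_TARGET_TYPES:
        raise ValueError(f"target_type must be one of {sorted(VALID_TARGET_TYPES)}")
    return {
        "source_nodes": source_nodes,
        "target_type": target_type,
        "target_uuid": target_uuid,
        "owner_uuid": owner_uuid,
    }
```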
Creates a new folder inside either the root of the drive or another folder. Required fields in request body:

name (max length 255): Folder name
parent_type: The type of the parent of the folder to be created. It must be one of: organization or folder
parent_uuid: The uuid of the parent of the folder to be created. It must be one of the following according to the parent type specified in the request body:

Returns HTTP status:
201: Folder creation successful
400: parent_type or parent_uuid is invalid
403: The user is not permitted to access the parent.

name required | string [ 1 .. 255 ] characters |
parent_uuid required | string <uuid> |
parent_type required | string Enum: "folder" "organization" "user" |
{
  "name": "08-june-14:03",
  "parent_uuid": "d32ad3ae-ab2e-444a-bde3-55e9e01b9401",
  "parent_type": "organization"
}
{
  "uuid": "43d218ce-62f6-42be-b765-c304f1229b11",
  "name": "example-folder-name"
}
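The documented limits can be enforced client-side before the POST, so an over-long name fails locally instead of as a 400. A minimal sketch (the helper is illustrative, not part of the API):

```python
def build_create_folder_body(name, parent_type, parent_uuid):
    """Validate and assemble the create-folder request body."""
    if not 1 <= len(name) <= 255:
        raise ValueError("name must be between 1 and 255 characters")
    if parent_type not in ("organization", "folder"):
        raise ValueError("parent_type must be 'organization' or 'folder'")
    return {
        "name": name,
        "parent_type": parent_type,
        "parent_uuid": parent_uuid,
    }
```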
Updates a folder's properties. Currently, we only allow updating the name property. Required fields in request body:

name (max length 255): Folder name

Returns HTTP status:
200: Update successful.
400: Invalid request body (for example: folder name is too long)
403: The user is not permitted to access the folder.

uuid required | string <uuid> UUID of the folder to update |
name | string [ 1 .. 255 ] characters |
{
  "name": "example_name"
}
{
  "name": "example_name"
}
Deletes a folder along with all its descendants in the resource tree.
Returns HTTP status:
204: Delete successful.
403: The folder exists, but the user is not allowed to access it
404: The folder does not exist
uuid required | string <uuid> UUID of the folder to delete |
List users granted access to a resource.
Returns a list of members. If there are none, the list will be empty. An additional accessrole_uuid will be returned for each of them, which can be used with the remove_membership / update_role endpoints.
resource_type: It must be one of: organization, folder, project or project_group.
resource_uuid: The uuid identifying the organization, folder, project or project_group.
page: A page number within the paginated results
page_size: Number of results to return per page
name: An (optional) search text which limits the returned results to users who match it in any of their email, first or last name.
role: One or more roles that, if specified, limit the response to users with those roles.

Returns HTTP status:
{
"count": 3,
"next": null,
"previous": null,
"results": [
{
"email": "bob@example.com",
"access_type": "ORGANIZATION",
"first_name": "Bob",
"last_name": "Jones",
"role": "OWNER",
"accessrole_uuid": "e30b1f48-f05e-41c2-be14-dcb53792bd2d",
"accessor_uuid": "b0034aa0-15b3-4699-96e9-7c73ec6265cb"
},
{
"email": "charlie@example.com",
"access_type": "INHERITED",
"first_name": "Charlie",
"last_name": "Forthright",
"role": "READER",
"accessrole_uuid": "505b2461-f0dd-4a67-9695-a9abd74747b7",
"accessor_uuid": "3dfd0afa-936f-49bc-b9e4-7722b0282207"
},
{
"email": "alice@example.com",
"access_type": "DIRECT",
"first_name": "Alice",
"last_name": "Smith",
"role": "EDITOR",
"accessrole_uuid": "eabd166e-d2fa-4d19-9df6-15f2b0f1e80f",
"accessor_uuid": "5cbc4c40-a572-4bd6-b0be-761713fe7d9f"
    }
]
}
resource_type required | string^[a-z][a-z0-9\-_]*$ |
resource_uuid required | string^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]... |
page | integer |
page_size | integer |
{
  "count": 123,
  "results": [
    {
      "email": "string",
      "first_name": "string",
      "last_name": "string",
      "accessrole_uuid": "797f3a00-6c31-4476-ba87-2db7322966f8",
      "role": "string",
      "access_type": "string",
      "accessor_uuid": "f39b03e5-7e9d-4e91-af64-f0c684e99da6"
    }
  ]
}
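Since the member list is paginated (count/next/previous/results), a client typically follows the next links until exhausted. A minimal sketch, with the HTTP call abstracted behind a fetch callable so the paging logic is independent of any particular HTTP library (the wrapper is assumed to set the Authorization: Bearer header):

```python
def iter_members(fetch, first_url):
    """Yield each member across all pages of a paginated listing.

    `fetch` is any callable mapping a URL to the decoded JSON page,
    e.g. a thin wrapper around your HTTP client.
    """
    url = first_url
    while url:
        page = fetch(url)
        for member in page["results"]:
            yield member
        url = page.get("next")  # null/None on the last page stops the loop
```

The same generator works for any of the paginated listings in this document, since they share the count/next/previous/results envelope.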
Returns the list of roles the current user can assign to other users to access the resource. If an assignee_uuid is provided, then the list of roles returned will be limited to roles that assignee does NOT already have at a higher resource in the tree.
The roles returned will not include roles which would be redundant given the user’s current roles higher up the path.
{
"roles": [
"MANAGER", "EDITOR"
]
}
resource_type required | string^[a-z][a-z0-9\-_]*$ |
resource_uuid required | string^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]... |
{
  "roles": [
    "EDITOR"
  ]
}
Get the details of a user's access for a given resource type and UUID.
Returns the details of a member accessing the given resource.
resource_type: It must be one of: organization, folder, project or project_group.
resource_uuid: The uuid identifying the organization, folder, project or project_group.

Returns HTTP status:
{
"access_type": "INHERITED",
"role": "OWNER",
"uuid": "e30b1f48-f05e-41c2-be14-dcb53792bd2d",
"resource_uuid": "b3c1d98a-f4e6-4d58-ae3e-0cbba97e5e79",
"resource_type": "organization"
}
resource_type required | string^[a-z][a-z0-9\-_]*$ |
resource_uuid required | string^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]... |
{
  "access_type": "string",
  "uuid": "095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
  "role": "string",
  "resource_uuid": "9a3c106a-0244-4962-b5f2-052f4eb77461",
  "resource_type": "string"
}
List invitations to a resource.
Returns a list of invitations. If there are none, the list will be empty.
resource_type: It must be one of: organization, folder, project or project_group.
resource_uuid: The uuid identifying the organization, folder, project or project_group.
page: A page number within the paginated results
page_size: Number of results to return per page
name: An (optional) search text which limits the returned results to invited users who match it in their email.
role: One or more roles that, if specified, limit the response to users with those roles.

Returns HTTP status:
{
"count": 3,
"next": null,
"previous": null,
"results": [
{
"email": "bob@example.com",
"uuid": "f4d30216-1de8-4483-9e81-da37fd15f282",
"expires_on": "2021-03-15T14:38:18.493014Z",
"invitation_status": "pending",
"role": "EDITOR"
}
]
}
resource_type required | string^[a-z][a-z0-9\-_]*$ |
resource_uuid required | string^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]... |
{
  "uuid": "095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
  "expires_on": "2019-08-24T14:15:22Z",
  "role": "string",
  "email": "string",
  "invitation_status": "string"
}
Invites users, or people who are not yet users, to access a resource, or updates the role of such users.
An email address should be provided for each person invited. Because users are specified by email address the caller does not need to know whether they are members of the organization or not.
A role parameter specifies a role that is to be applied to all users.
Finally, a redirect_url parameter specifies a page (using https and in the pix4d.com domain) that a user will be redirected to upon accepting the invitation.
Optionally, a name can be provided that is used in the invitation email for external resources.
One of two emails will be sent to each user depending on whether they are members of the organization or not.
resource_type: It must be one of: organization, folder, project or project_group.
resource_uuid: The uuid identifying the resource.

Returns HTTP status:
[
{
"email": "newly-created@example.com",
"access_type": "DIRECT",
"first_name": "Bob",
"last_name": "Jones",
"role": "MANAGER",
"accessrole_uuid": "e30b1f48-f05e-41c2-be14-dcb53792bd2d"
},
{
"email": "invited@pix3d.com",
"expires_on": "2023-07-17T00:02:07.282222Z",
"invitation_status": "pending",
"role": "MANAGER",
"uuid": "564bc4f1-43a0-4be2-908d-11442cb73efb"
}
]
resource_type required | string^[a-z][a-z0-9\-_]*$ |
resource_uuid required | string^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]... |
invitees required | Array of strings <email> [ 1 .. 500 ] items [ items <email> non-empty ] |
role required | string (RoleEnum) Enum: "EDITOR" "MANAGER" "OWNER" "READER"
|
resource_name | string non-empty |
{
  "invitees": [
    "user@example.com"
  ],
  "role": "EDITOR",
  "resource_name": "string"
}
{
  "uuid": "095be615-a8ad-4c33-8e9c-c7612fbf6c9f",
  "expires_on": "2019-08-24T14:15:22Z",
  "role": "string",
  "email": "string",
  "invitation_status": "string"
}
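The invitation body can be assembled and sanity-checked client-side. A sketch; the role enum and the 1-500 invitee limit come from the schema above, while the email check here is a deliberately loose local check, not full validation:

```python
# Assignable roles, per RoleEnum above.
VALID_ROLES = {"EDITOR", "MANAGER", "OWNER", "READER"}

def build_invite_body(invitees, role, resource_name=None):
    """Validate and assemble the request body for the invitation endpoint."""
    if not 1 <= len(invitees) <= 500:
        raise ValueError("invitees must contain between 1 and 500 addresses")
    if any("@" not in addr for addr in invitees):
        raise ValueError("every invitee must be an email address")
    if role not in VALID_ROLES:
        raise ValueError(f"role must be one of {sorted(VALID_ROLES)}")
    body = {"invitees": invitees, "role": role}
    if resource_name:
        body["resource_name"] = resource_name
    return body
```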
Remove access to a resource for a user specified by accessrole uuid. If the resource is an organization then this will remove all access for the given user. Otherwise, for other resource types, note that this only removes access at the resource specified. The user may still be able to access the resource via access set at the organization or an intermediate resource.
resource_type: It must be one of: organization, folder, project or project_group.
resource_uuid: The uuid identifying the organization, folder, project or project_group.
accessrole_uuid: uuid of the access role to be removed.

Returns HTTP status:
accessrole_uuid required | string^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]... |
resource_type required | string^[a-z][a-z0-9\-_]*$ |
resource_uuid required | string^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]... |
On October 12th, 2021, the Release Notes section was inaugurated. From now on, new implementations will be listed in this section together with the date when they are released.
As of today, the features included in the API are:
As our products evolve we need to occasionally remove and decommission functionality.
To ensure continuity for our API clients, documentation will continue to be available here until the feature is finally removed.
We recommend reaching out to your PIX4D Sales/Support contact for assistance with choosing the best migration method away from these deprecated services.
POST on https://cloud.pix4d.com/project/api/v3/projects/{id}/start_processing/. See the full documentation for this endpoint.
The body request includes:
{
  "custom_template_s3_key": "string",
  "custom_template_s3_bucket": "string",
  "standard_template": "string",
  "tags": ["string"]
}
Standard processing with PIX4Dmapper
Processing with PIX4Dmapper allows computing with different sets of parameters, which are called "templates". Depending on the type of flight and the desired outputs, a different template can be selected.
There are different default templates that can be used by PIX4Dmapper. Detailed information about all of them can be found in this support article
To use a default template, pass the name of the template in the value of the standard_template
key in the request body.
{
  "standard_template": "<template-name>"
}
The strings which correspond to each of the default templates are listed below:
String | Default Template |
---|---|
3d-maps | 3D Maps |
3d-maps-rapid | 3D Maps - Rapid/Low Res |
3d-models | 3D Models |
3d-models-rapid | 3D Models - Rapid/Low Res |
ag-modified-camera | Ag Modified Camera |
ag-modified-camera-rapid | Ag Modified Camera - Rapid/Low Res |
ag-multispectral | Ag Multispectral |
ag-rgb | Ag RGB |
ag-rgb-rapid | Ag RGB - Rapid/Low Res |
thermal-camera | Thermal Camera |
thermomap-camera | ThermoMAP Camera |
For example, in order to process a project with the standard 3D model template:
Send a POST request to https://cloud.pix4d.com/project/api/v3/projects/{id}/start_processing/ with the following body:
{
  "standard_template": "3d-models"
}
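The table of template strings can be kept as a small client-side lookup, so an unknown name fails fast before calling start_processing. A sketch; the mapping simply restates the table above, and the helper name is illustrative:

```python
# Template strings accepted by start_processing, mapped to their display names
# (restating the table of default templates above).
STANDARD_TEMPLATES = {
    "3d-maps": "3D Maps",
    "3d-maps-rapid": "3D Maps - Rapid/Low Res",
    "3d-models": "3D Models",
    "3d-models-rapid": "3D Models - Rapid/Low Res",
    "ag-modified-camera": "Ag Modified Camera",
    "ag-modified-camera-rapid": "Ag Modified Camera - Rapid/Low Res",
    "ag-multispectral": "Ag Multispectral",
    "ag-rgb": "Ag RGB",
    "ag-rgb-rapid": "Ag RGB - Rapid/Low Res",
    "thermal-camera": "Thermal Camera",
    "thermomap-camera": "ThermoMAP Camera",
}

def build_start_processing_body(standard_template):
    """Return the start_processing request body for a default template."""
    if standard_template not in STANDARD_TEMPLATES:
        raise ValueError(f"unknown template: {standard_template!r}")
    return {"standard_template": standard_template}
```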
Even though there are several default templates that cover most use cases, it is also possible to create your own template for processing.
Information regarding how to create a template file (*.tmpl) can be found in this support article
It is recommended to use the PIX4Dmapper user interface as much as possible to create templates and export them to a file. PIX4Dmapper can be downloaded from the download-page.
Once the .tmpl file exists, there are two possibilities:
In both cases, in order to process a project with a user-defined template:
The request body must specify the custom_template_s3_key
and the custom_template_s3_bucket
:
{
  "custom_template_s3_key": "string",
  "custom_template_s3_bucket": "string"
}
PIX4Dmapper pipelines provide error reasons only, not error codes.
The API offers the option to upload an existing .p4d file and process with it. If that is the case, the information in that file will be taken into account during the processing.
In order to process with an existing .p4d file, the workflow is as follows:
It is recommended to use the PIX4Dmapper user interface as much as possible to create the .p4d file. Learn how to download PIX4Dmapper in this article.
In order to create the project, open PIX4Dmapper and follow the steps explained on this support page
Please follow what is explained in the section "Upload the photos" in the first example guide.
Provided your .p4d file is in a folder at $HOME/p4d, you can open the AWS CLI and type:
aws s3 cp ./p4d/some_filename.p4d "s3://${S3_BUCKET}/${S3_BASE_PATH}/"
Make sure the S3 bucket and S3 key are correct.
POST on https://cloud.pix4d.com/project/api/v3/projects/{id}/extras/
Request body:
{
  "file_key": "${S3_BASE_PATH}/some_filename.p4d"
}
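The file_key sent to the extras endpoint must match where the file was uploaded, i.e. the S3 base path plus the file name. A minimal sketch that derives the body from the local path and base path (the helper is illustrative; posixpath keeps forward slashes regardless of the local OS):

```python
import posixpath

def build_extras_body(s3_base_path, local_filename):
    """Derive the extras request body for an uploaded .p4d file."""
    # Only the file name matters; any local directory prefix is dropped.
    name = posixpath.basename(local_filename.replace("\\", "/"))
    return {"file_key": posixpath.join(s3_base_path, name)}
```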
POST on https://cloud.pix4d.com/project/api/v3/projects/{id}/start_processing/ ({id} is the project ID)
All the information contained in the .p4d file will be used in the computation: PIX4Dmapper version, coordinate systems, processing options, etc. This article explains the processing options which can be selected by the user.