Bulk API Developer's Guide: Version 21.0: Spring '11
Note: Any unreleased services or features referenced in this or other press releases or public statements are not currently available and may
not be delivered on time or at all. Customers who purchase our services should make their purchase decisions based upon features that are
currently available.
Chapter 1
Introduction
The Bulk API provides programmatic access to allow you to quickly load your organization's data into Salesforce. To use this
document, you should have a basic familiarity with software development, Web services, and the Salesforce user interface.
Any functionality described in this guide is available only if your organization has the Bulk API feature enabled. This feature
is enabled by default for Unlimited, Enterprise, and Developer Editions.
The Force.com Migration Tool is ideal if you want to use a script or a command-line utility for moving
metadata between a local directory and a Salesforce organization.
Chapter 2
Quick Start
Use the quick start sample in this section to create HTTP requests that insert new contact records using the REST-based Bulk
API. The instructions progress through logging in, submitting the records, checking status, and retrieving the results.
Note: Before you begin building an integration or other client application:
• Install your development platform according to its product documentation.
• Read through all the steps before beginning this quick start. You may also wish to review the rest of this document
to familiarize yourself with terms and concepts.
Sending HTTP Requests with cURL

The URIs used in this quick start follow the format described in About URIs; the part after the API version (Resource_address) varies depending on the job or batch being processed.
The easiest way to start using the Bulk API is to enable it for processing records in Data Loader using CSV files. If you use Data Loader, you don't need to craft your own HTTP requests or write your own client application. For an example of writing a client application using Java, see Sample Client Application Using Java on page 57.
See Also:
About URIs
Data Loader Developer's Guide
2. Replace your_username and your_password with your Salesforce user name and password.
3. Using a command-line window, execute a cURL login command such as the one sketched after these steps. The Soap/u/ portion of the login URI specifies the partner WSDL. You can use Soap/c/ to specify the enterprise WSDL.
4. Salesforce returns an XML response that includes <sessionId> and <serverUrl> elements. Note the values of the
<sessionId> element and the first part of the host name (instance), such as na1-api, from the <serverUrl> element.
Use these values in subsequent requests to the Bulk API.
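The login call is a SOAP request to the Web services API, not to the Bulk API. A minimal sketch, assuming the request body is saved as login.txt (the file name and the envelope below are illustrative):

curl https://login.salesforce.com/services/Soap/u/21.0 -H "Content-Type: text/xml; charset=UTF-8" -H "SOAPAction: login" -d @login.txt

where login.txt contains a partner API login envelope such as:

<?xml version="1.0" encoding="UTF-8"?>
<env:Envelope xmlns:env="http://schemas.xmlsoap.org/soap/envelope/">
  <env:Body>
    <n1:login xmlns:n1="urn:partner.soap.sforce.com">
      <n1:username>your_username</n1:username>
      <n1:password>your_password</n1:password>
    </n1:login>
  </env:Body>
</env:Envelope>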
Note: If the <sessionId> includes an exclamation mark (!), it should be escaped with a backslash (\!) when
used in subsequent cURL commands.
See Also:
Setting a Session Header
Web Services API Developer's Guide
Step 2: Creating a Job

Caution: The operation value must be all lower case. For example, you get an error if you use INSERT instead of insert.
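The job-creation request works like the other cURL calls in this quick start. A minimal sketch, assuming the request body is saved as job.txt (the file name is illustrative) and using fields from the JobInfo reference:

curl https://instance.salesforce.com/services/async/21.0/job -H "X-SFDC-Session: sessionId" -H "Content-Type: application/xml; charset=UTF-8" -d @job.txt

where job.txt contains:

<?xml version="1.0" encoding="UTF-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
  <operation>insert</operation>
  <object>Contact</object>
  <contentType>CSV</contentType>
</jobInfo>

Salesforce responds with a <jobInfo> document that echoes these values and adds system fields; its closing elements look like the following: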
<apiVersion>21.0</apiVersion>
</jobInfo>
Note the value of the job ID returned in the <id> element. Use this ID in subsequent operations.
See Also:
Creating a New Job
Step 3: Adding a Batch to the Job

Format the data as either CSV or XML if you're not including binary attachments. For information about binary attachments, see Loading Binary Attachments on page 22. For information about batch size limitations, see Batch size and limits on page 54.

This example uses CSV, which is the recommended format. It's your responsibility to divide your data set into batches that fit within the limits. In this example, we keep it very simple with just a few records.
To add a batch to a job:
1. Create a CSV file named data.csv with the following two records:
FirstName,LastName,Department,Birthdate,Description
Tom,Jones,Marketing,1940-06-07Z,"Self-described as ""the top"" branding guru on the West Coast"
Ian,Dury,R&D,,"World-renowned expert in fuzzy logic design.
Influential in technology purchases."
Note that the value for the Description field in the last row spans multiple lines, so it's wrapped in double quotes.
2. Using a command-line window, execute the following cURL command:
curl https://instance.salesforce.com/services/async/21.0/job/jobId/batch -H
"X-SFDC-Session: sessionId" -H "Content-Type: text/csv; charset=UTF-8" --data-binary
@data.csv
instance is the portion of the <serverUrl> element and sessionId is the <sessionId> element that you noted in
the login response. jobId is the job ID that was returned when you created the job.
Salesforce returns an XML <batchInfo> response. Salesforce does not parse the CSV content or otherwise validate the batch at this stage; the response only acknowledges that the batch was received.
3. Note the value of the batch ID returned in the <id> element. You can use this batch ID later to check the status of the
batch.
See Also:
Preparing CSV Files
Adding a Batch to a Job
Bulk API Limits
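Step 4: Closing the Job

Closing the job tells Salesforce that no more batches will be submitted. A minimal sketch of the close request, assuming the request body is saved as close_job.txt (the file name is illustrative) and based on the JobInfo state field:

curl https://instance.salesforce.com/services/async/21.0/job/jobId -H "X-SFDC-Session: sessionId" -H "Content-Type: application/xml; charset=UTF-8" -d @close_job.txt

where close_job.txt contains:

<?xml version="1.0" encoding="UTF-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
  <state>Closed</state>
</jobInfo>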
instance is the portion of the <serverUrl> element and sessionId is the <sessionId> element that you noted in
the login response. jobId is the job ID that was returned when you created the job.
This cURL command updates the job resource state from Open to Closed.
See Also:
Closing a Job
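Step 5: Checking Batch Status

To check the status of the batch, request its BatchInfo. A sketch of such a request, using the instance, session ID, job ID, and batch ID noted earlier:

curl https://instance.salesforce.com/services/async/21.0/job/jobId/batch/batchId -H "X-SFDC-Session: sessionId"

The response is a <batchInfo> document; it ends with elements such as the following: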
<createdDate>2009-09-01T17:44:45.000Z</createdDate>
<systemModstamp>2009-09-01T17:44:45.000Z</systemModstamp>
<numberRecordsProcessed>2</numberRecordsProcessed>
</batchInfo>
If Salesforce couldn't read the batch content or if the batch contained errors, such as invalid field names in the CSV header
row, the batch state is Failed. When batch state is Completed, all records in the batch have been processed. However,
individual records may have failed. You need to retrieve the batch result to see the status of individual records.
You don't have to check the status of each batch individually. You can check the status for all batches that are part of the job
by running the following cURL command:
curl https://instance.salesforce.com/services/async/21.0/job/jobId/batch -H "X-SFDC-Session:
sessionId"
instance is the portion of the <serverUrl> element and sessionId is the <sessionId> element that you noted in the login response. jobId is the job ID that was returned when you created the job.
See Also:
Getting Information for a Batch
Getting Information for All Batches in a Job
Interpreting Batch State
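Step 6: Retrieving Batch Results

Once the batch state is Completed, retrieve the results for the batch. A sketch of the request, assuming the result resource lives under the batch URI (an assumption based on the resource layout used elsewhere in this guide):

curl https://instance.salesforce.com/services/async/21.0/job/jobId/batch/batchId/result -H "X-SFDC-Session: sessionId"

For a CSV batch, the response resembles the following: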
"Id","Success","Created","Error"
"003x0000004ouM4AAI","true","true",""
"003x0000004ouM5AAI","true","true",""
The response body is a CSV file with a row for each row in the batch request. If a record was created, the ID is contained in
the row. If a record was updated, the value in the Created column is false. If a record failed, the Error column contains an
error message.
See Also:
Getting Batch Results
Handling Failed Records in Batches
Chapter 3
Planning Bulk Data Loads
In this chapter ...
• General Guidelines for Data Loads

In most circumstances, the Bulk API is significantly faster than the SOAP-based API for loading large numbers of records. However, performance depends on the type of data that you're loading as well as any workflow rules and triggers associated with the objects in your batches. It's useful to understand the factors that determine optimal loading time.
General Guidelines for Data Loads
If you encounter errors related to these operations, create a separate job to process the data in serial mode.
Note: Because your data model is unique to your organization, salesforce.com can't predict exactly when you
might see lock contention problems.
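For example, a job-creation request body that asks for serial processing sets the concurrencyMode field described in the JobInfo reference (a sketch; the operation and object shown are illustrative):

<?xml version="1.0" encoding="UTF-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
  <operation>update</operation>
  <object>Account</object>
  <concurrencyMode>Serial</concurrencyMode>
  <contentType>CSV</contentType>
</jobInfo>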
Chapter 4
Preparing Data Files
In this chapter ...
• Finding Field Names
• Valid Date Format in Records
• Preparing CSV Files
• Preparing XML Files

The Bulk API processes records in comma-separated values (CSV) files or XML files. This section tells you how to prepare your batches for processing. For information about loading records containing binary attachments, see Loading Binary Attachments on page 22.
Preparing CSV Files
The following CSV sample uses a relationship field (ReportsTo.Email) in the header row:

FirstName,LastName,ReportsTo.Email
Tom,Jones,[email protected]
To use a relationship name in a column header, replace the __c in the child object's custom field with __r. For more information about relationships, see Understanding Relationship Names.
The following CSV file uses a relationship:
Name,Mother_Of_Child__r.External_ID__c
CustomObject1,123456
The following CSV file uses a polymorphic relationship field (Lead:Who.Email) in the header row:

Subject,Priority,Status,Lead:Who.Email,Owner.Id
Test Bulk API polymorphic reference field,Normal,Not Started,[email protected],005D0000001AXYz
Caution: The ObjectType: portion of a field column header is only required for a polymorphic field. You get an
error if you omit this syntax for a polymorphic field. You also get an error if you include this syntax for a field that is
not polymorphic.
Sample CSV File

The following CSV sample includes a contact record that uses the ReportsTo.Email relationship field:

FirstName,LastName,Title,ReportsTo.Email,Birthdate,Description
Tom,Jones,Senior Director,[email protected],1940-06-07Z,"Self-described as ""the top"" branding guru on the West Coast"

If a Description value spans multiple lines, as in the data.csv example in the Quick Start, the value must be enclosed in double quotes.
See Also:
Sample XML File
Data Loader Developer's Guide
Preparing XML Files

Relationship fields in an XML record use the parent object's relationship name and an indexed field on the parent record:

<RelationshipName>
<sObject>
<IndexedFieldName>[email protected]</IndexedFieldName>
</sObject>
</RelationshipName>
Use the describeSObjects() call in the SOAP-based Web services API to get the relationshipName property value
for a field. You must use an indexed field to uniquely identify the parent record for the relationship. A standard field is indexed
if its idLookup property is set to true.
The following sample includes a contact record that includes the Reports To field, which is a reference to another contact.
ReportsTo is the relationshipName property value for the Reports To field. In this case, the parent object for the
Reports To field is also a contact, so we use the Email field to identify the parent record. The idLookup property value
for the Email field is true. To see the idLookup property for a field, see the Field Properties column in the field table for
each standard object.
<?xml version="1.0" encoding="UTF-8"?>
<sObjects xmlns="http://www.force.com/2009/06/asyncapi/dataload">
<sObject>
<FirstName>Ray</FirstName>
<LastName>Riordan</LastName>
<ReportsTo>
<sObject>
<Email>[email protected]</Email>
</sObject>
</ReportsTo>
</sObject>
</sObjects>
If the relationship field is polymorphic, you must also specify the object type of the parent record with a type element:

<RelationshipName>
<sObject>
<type>ObjectTypeName</type>
<IndexedFieldName>[email protected]</IndexedFieldName>
</sObject>
</RelationshipName>
1. The WhoId field is polymorphic and has a relationshipName of Who. It refers to a lead and the indexed Email field
uniquely identifies the parent record.
2. The OwnerId field is not polymorphic and has a relationshipName of Owner. It refers to a user and the indexed Id
field uniquely identifies the parent record.
Caution: The <type>ObjectTypeName</type> element is only required for a polymorphic field. You get an error
if you omit this element for a polymorphic field. You also get an error if you include this syntax for a field that is not
polymorphic.
Note: You must include the type field for a polymorphic field and exclude it for non-polymorphic fields in any batch.
The batch fails if you do otherwise. A polymorphic field can refer to more than one type of object as a parent. For
example, either a contact or a lead can be the parent of a task. In other words, the WhoId field of a task can contain
the ID of either a contact or a lead.
• Fields with a double data type can include fractional values. Values can be stored in scientific notation if the number is
large enough (or, for negative numbers, small enough), as indicated by the W3C XML Schema Part 2: Datatypes Second
Edition specification.
See Also:
Sample CSV File
Chapter 5
Loading Binary Attachments
In this chapter ...
• Creating a request.txt File
• Creating a Zip Batch File with Binary Attachments
• Creating a Job for Batches with Binary Attachments
• Creating a Batch with Binary Attachments

The Bulk API can load binary attachments, which can be Attachment objects or Salesforce CRM Content.
Creating a request.txt File
For the Attachment object, the notation for the following fields is particularly important:
• The Name field is the file name of the binary attachment. The easiest way to get a unique name for each attachment in
your batch is to use the relative path from the base directory to the binary attachment. For example, attachment1.gif
or subdir/attachment2.doc.
• The Body is the relative path to the binary attachment, preceded with a # symbol. For example, #attachment1.gif or #subdir/attachment2.doc.
• The ParentId field identifies the parent record, such as an account or a case, for the attachment.
The batch file can also include other optional Attachment fields, such as Description. For more information, see Attachment.
Name,ParentId,Body
attachment1.gif,Account Id,#attachment1.gif
subdir/attachment2.doc,Account Id,#subdir/attachment2.doc
The batch file can also be provided in XML format; an XML request.txt uses the same <sObjects> structure as any other XML batch file.
See Also:
Creating a Zip Batch File with Binary Attachments
See Also:
Creating a request.txt File
Creating a Job for Batches with Binary Attachments

Note: The batches for this job contain data in CSV format, so the contentType field is set to ZIP_CSV. For XML batches, use ZIP_XML instead.
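A sketch of such a job-creation request body, based on the contentType values listed in the JobInfo reference (the operation and object shown are illustrative):

<?xml version="1.0" encoding="UTF-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
  <operation>insert</operation>
  <object>Attachment</object>
  <contentType>ZIP_CSV</contentType>
</jobInfo>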
Note the value of the job ID returned in the <id> element. Use this ID in subsequent operations.
See Also:
Creating a Batch with Binary Attachments
Creating a New Job
Creating a Batch with Binary Attachments
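Upload the zip file built earlier as a batch for the job. A sketch of the request, assuming the file is named request.zip and that zip/csv is the content type for zipped CSV batches (both assumptions):

curl https://instance.salesforce.com/services/async/21.0/job/jobId/batch -H "X-SFDC-Session: sessionId" -H "Content-Type: zip/csv" --data-binary @request.zip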
Salesforce does not parse the CSV content or otherwise validate the batch until later. The response only acknowledges
that the batch was received.
Note the value of the batch ID returned in the <id> element. You can use this batch ID later to check the status of the batch.
For details on proceeding to close the associated job, check batch status, and retrieve batch results, see the Quick Start.
See Also:
Creating a Job for Batches with Binary Attachments
Adding a Batch to a Job
Chapter 6
Request Basics
In this chapter ...
• About URIs
• Setting a Session Header

This section describes some basics about the Bulk API, including the format of URIs used to perform operations and details on how to authenticate requests using a session header.
About URIs
You send HTTP requests to a URI to perform operations with the Bulk API.
The URI where you send HTTP requests has the following format:
Web_Services_SOAP_endpoint_instance_name/services/async/APIversion/Resource_address
Think of the part of the URI through the API version as a base URI which is used for all operations. The part after the API
version (Resource_address) varies depending on the job or batch being processed. For example, if your organization is on
the na5 instance and you're working with version 21.0 of the Bulk API, your base URI would be
https://na5.salesforce.com/services/async/21.0.
The instance name for your organization is returned in the LoginResult serverUrl field.
See Also:
Working with Jobs
Working with Batches
See Also:
Quick Start
Sample Client Application Using Java
Chapter 7
Working with Jobs
In this chapter ...
• Creating a New Job
• Monitoring a Job
• Closing a Job
• Getting Job Details
• Aborting a Job
• Job and Batch Lifespan

You process a set of records by creating a job that contains one or more batches. The job specifies which object is being processed (for example, Account or Opportunity) and what type of action is being used (insert, upsert, update, or delete).

A job is represented by the JobInfo resource. This resource is used to create a new job, get status for an existing job, and change status for a job.
Creating a New Job
The contentType field of a job indicates the format of the batches associated with it, such as CSV. For alternative options, such as XML, see JobInfo on page 46.
Caution: The operation value must be all lower case. For example, you get an error if you use INSERT instead
of insert.
Salesforce responds to a job-creation request with a <jobInfo> document that echoes the values you supplied, including <contentType>CSV</contentType>, and adds system fields such as the job <id> and <state>.
See Also:
Creating a Job for Batches with Binary Attachments
Getting Job Details
Closing a Job
Aborting a Job
Adding a Batch to a Job
Job and Batch Lifespan
Bulk API Limits
About URIs
JobInfo
Quick Start
Monitoring a Job
You can monitor a Bulk API job in Salesforce. The monitoring page tracks jobs and batches created by any client application,
including Data Loader or any client application that you write.
To track the status of bulk data load jobs that are in progress or recently completed, click Your Name ➤ Setup ➤ Monitoring
➤ Bulk Data Load Jobs.
For more information, see “Monitoring Bulk Data Load Jobs” in the Salesforce online help.
See Also:
Creating a New Job
Getting Job Details
Closing a Job
Aborting a Job
Adding a Batch to a Job
Job and Batch Lifespan
Bulk API Limits
Data Loader Developer's Guide
Closing a Job
Close a job by sending a POST request to the following URI. The request URI identifies the job to close. When a job is
closed, no more batches can be added.
URI
https://instance_name-api.salesforce.com/services/async/APIversion/job/jobId
See Also:
Creating a New Job
Monitoring a Job
Getting Job Details
Aborting a Job
Job and Batch Lifespan
Bulk API Limits
About URIs
JobInfo
Quick Start
Getting Job Details

Get details about an existing job by sending a GET request to the job URI. The response is a <jobInfo> document; for a job that has been closed, it ends with elements such as:
<object>Account</object>
<createdById>005D0000001ALVFIA4</createdById>
<createdDate>2009-04-14T18:15:59.000Z</createdDate>
<systemModstamp>2009-04-14T18:15:59.000Z</systemModstamp>
<state>Closed</state>
</jobInfo>
See Also:
Creating a New Job
Monitoring a Job
Closing a Job
Aborting a Job
Adding a Batch to a Job
Job and Batch Lifespan
Bulk API Limits
About URIs
JobInfo
Quick Start
Aborting a Job
Abort an existing job by sending a POST request to the following URI. The request URI identifies the job to abort. When a
job is aborted, no more records are processed. Changes to data may already have been committed and aren't rolled back.
URI
https://instance_name-api.salesforce.com/services/async/APIversion/job/jobId
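A sketch of the abort request using cURL, assuming the request body is saved as abort_job.txt (the file name is illustrative) and based on the JobInfo state field:

curl https://instance_name-api.salesforce.com/services/async/21.0/job/jobId -H "X-SFDC-Session: sessionId" -H "Content-Type: application/xml; charset=UTF-8" -d @abort_job.txt

where abort_job.txt contains:

<?xml version="1.0" encoding="UTF-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
  <state>Aborted</state>
</jobInfo>

The response echoes the job details and ends with the new state: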
<state>Aborted</state>
</jobInfo>
See Also:
Getting Job Details
Creating a New Job
Monitoring a Job
Closing a Job
Job and Batch Lifespan
Bulk API Limits
About URIs
JobInfo
See Also:
Creating a New Job
Monitoring a Job
Getting Job Details
Closing a Job
Aborting a Job
Adding a Batch to a Job
Bulk API Limits
About URIs
JobInfo
Quick Start
Chapter 8
Working with Batches
In this chapter ...
• Adding a Batch to a Job
• Monitoring a Batch
• Getting Information for a Batch
• Getting Information for All Batches in a Job
• Interpreting Batch State
• Getting a Batch Request
• Getting Batch Results
• Handling Failed Records in Batches

A batch is a set of records sent to the server in an HTTP POST request. Each batch is processed independently by the server, not necessarily in the order it is received.

A batch is created by submitting a CSV or XML representation of a set of records and any references to binary attachments in an HTTP POST request. Once created, the status of a batch is represented by a BatchInfo resource. When a batch is complete, the result for each record is available in a result set resource.

Batches may be processed in parallel. It's up to the client to decide how to divide the entire data set into a suitable number of batches.

Adjust batch sizes based on processing times: start with 5,000 records, reduce the batch size if a batch takes more than five minutes to process, and increase it if a batch finishes in only a few seconds. If you get a timeout error when processing a batch, split your batch into smaller batches and try again.
Adding a Batch to a Job
If the contentType field of the associated job was set to XML, the batch data must be in XML format. For alternative formats for batch data, such as CSV, see JobInfo on page 46.
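A sketch of an XML batch-creation request, assuming the batch data is saved as data.xml (the file name and the application/xml content type are assumptions):

curl https://instance.salesforce.com/services/async/21.0/job/jobId/batch -H "X-SFDC-Session: sessionId" -H "Content-Type: application/xml; charset=UTF-8" --data-binary @data.xml

The response is a <batchInfo> document; it ends with elements such as: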
<numberRecordsProcessed>0</numberRecordsProcessed>
</batchInfo>
See Also:
Creating a Batch with Binary Attachments
Getting Information for a Batch
Monitoring a Batch
Getting Information for All Batches in a Job
Interpreting Batch State
Getting a Batch Request
Getting Batch Results
Working with Jobs
Job and Batch Lifespan
Bulk API Limits
About URIs
BatchInfo
Quick Start
Monitoring a Batch
You can monitor a Bulk API batch in Salesforce.
To track the status of bulk data load jobs and their associated batches, click Your Name ➤ Setup ➤ Monitoring ➤ Bulk
Data Load Jobs. Click on the Job ID to view the job detail page.
The job detail page includes a related list of all the batches for the job. The related list provides View Request and View
Response links for each batch. If the batch is a CSV file, the links return the request or response in CSV format. If the batch
is an XML file, the links return the request or response in XML format. These links are available for batches created in API
version 19.0 and later.
For more information, see “Monitoring Bulk Data Load Jobs” in the Salesforce online help.
See Also:
Getting Information for a Batch
Adding a Batch to a Job
Getting Information for All Batches in a Job
Interpreting Batch State
Getting a Batch Request
Getting Batch Results
Handling Failed Records in Batches
Working with Jobs
Job and Batch Lifespan
Bulk API Limits
About URIs
BatchInfo
Quick Start
Getting Information for a Batch

Get information about an existing batch by sending a GET request to the batch resource URI. The response is a <batchInfo> document; it ends with elements such as:
<numberRecordsProcessed>0</numberRecordsProcessed>
</batchInfo>
See Also:
Adding a Batch to a Job
Monitoring a Batch
Getting Information for All Batches in a Job
Interpreting Batch State
Getting a Batch Request
Getting Batch Results
Job and Batch Lifespan
Bulk API Limits
BatchInfo
About URIs
Working with Jobs
Quick Start
Getting Information for All Batches in a Job

Get information about all batches in a job by sending a GET request to the job's batch resource URI (the same URI used to add a batch to the job).

Method
GET
The response is a <batchInfoList> document that contains a <batchInfo> element for each batch in the job; it ends with:
</batchInfo>
</batchInfoList>
See Also:
Adding a Batch to a Job
Monitoring a Batch
Getting Information for a Batch
Interpreting Batch State
Getting a Batch Request
Getting Batch Results
Job and Batch Lifespan
Bulk API Limits
BatchInfo
About URIs
Working with Jobs
Quick Start
Interpreting Batch State

The state field of a BatchInfo describes the stage of processing for a batch:

InProgress
The batch is currently being processed. If the job associated with this batch is aborted, this batch is still processed to
completion.
Completed
The batch has been processed completely and the result resource is available. The result resource indicates if some records
have failed. A batch can be completed even if some or all the records have failed. If a subset of records failed, the successful
records aren't rolled back.
Failed
The batch failed to process the full request due to an unexpected error, such as the request being compressed with an
unsupported format, or an internal server error. Even if the batch failed, some records could have been completed
successfully. If the numberRecordsProcessed field in the response is greater than zero, you should get the results to
see which records were processed, and if they were successful.
Not Processed
The batch won't be processed. This state is assigned when a job is aborted while the batch is queued.
See Also:
Adding a Batch to a Job
Monitoring a Batch
Getting Information for All Batches in a Job
Getting a Batch Request
Getting Batch Results
Handling Failed Records in Batches
Job and Batch Lifespan
Bulk API Limits
BatchInfo
About URIs
Working with Jobs
Quick Start
Getting a Batch Request

Retrieve the request data that was uploaded for a batch by sending a GET request to the following URI.

URI
https://instance_name-api.salesforce.com/services/async/APIversion/job/jobId/batch/batchId/request
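For example, using cURL with the values noted during login and job and batch creation:

curl https://instance_name-api.salesforce.com/services/async/21.0/job/jobId/batch/batchId/request -H "X-SFDC-Session: sessionId"

For an XML batch, the response is the original <sObjects> request data; it ends with: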
</sObject>
</sObjects>
See Also:
Getting Information for a Batch
Monitoring a Batch
Getting Information for All Batches in a Job
Interpreting Batch State
Getting Batch Results
Working with Jobs
Job and Batch Lifespan
Bulk API Limits
About URIs
BatchInfo
Quick Start
Getting Batch Results

Retrieve the results for a batch by sending a GET request to the batch result resource. For a CSV batch, the response is a CSV file with a row for each record in the batch request. For example:
"Id","Success","Created","Error"
"003D000000Q89kQIAR","true","true",""
"003D000000Q89kRIAR","true","true",""
"","false","false","REQUIRED_FIELD_MISSING:Required fields are missing:
[LastName]:LastName --"
Note: The batch result indicates that the last record was not processed successfully because the LastName field
was missing. The Error column includes error information. You must look at the Success field for each result
row to ensure that all rows were processed successfully. For more information, see Handling Failed Records in
Batches on page 43.
See Also:
Adding a Batch to a Job
Monitoring a Batch
Getting a Batch Request
Getting Information for a Batch
Getting Information for All Batches in a Job
Interpreting Batch State
Job and Batch Lifespan
Bulk API Limits
BatchInfo
About URIs
Working with Jobs
Quick Start
"Id","Success","Created","Error"
"003D000000Q89kQIAR","true","true",""
"003D000000Q89kRIAR","true","true",""
After you have examined each result record, you can manually fix each record in the error file and submit these records in a
new batch. Repeat the earlier steps to check that each record is processed successfully.
See Also:
Adding a Batch to a Job
Errors
Bulk API Limits
Chapter 9
Reference
In this chapter ...
• Schema
• JobInfo
• BatchInfo
• HTTP Status Codes
• Errors
• Bulk API Limits

This section describes the supported resources for the Bulk API, as well as details on errors and processing limits.
Schema
The Bulk API service is described by an XML Schema Document (XSD) file. You can download the schema file for an API
version by using the following URI:
Web_Services_SOAP_endpoint_instance_name/services/async/APIversion/AsyncApi.xsd
For example, if your organization is on the na5 instance and you're working with version 21.0 of the Bulk API, the URI is:
https://na5.salesforce.com/services/async/21.0/AsyncApi.xsd
The instance name for your organization is returned in the LoginResult serverUrl field.
See Also:
JobInfo
BatchInfo
Errors
JobInfo
A job contains one or more batches of data for you to submit to Salesforce for processing. When a job is created, Salesforce
sets the job state to Open.
You can create a new job, get information about a job, close a job, or abort a job using the JobInfo resource.
Fields
Each entry shows the field name, its type, and any request restrictions, followed by a description.

assignmentRuleId (string; can't update after creation): The ID of a specific assignment rule to run for a case or a lead. The assignment rule can be active or inactive. The ID can be retrieved by using the SOAP-based Web services API to query the AssignmentRule object.

concurrencyMode (ConcurrencyModeEnum): The concurrency mode for the job. The valid values are:
• Parallel: Process batches in parallel mode. This is the default value.
• Serial: Process batches in serial mode. Processing in parallel can cause database contention. When this is severe, the job may fail. If you're experiencing this issue, submit the job with serial concurrency mode. This guarantees that batches are processed one at a time. Note that using this option may significantly increase the processing time for a job.

contentType (ContentType): The content type for the job. The valid values are:
• CSV: data in CSV format
• XML: data in XML format (default option)
• ZIP_CSV: data in CSV format in a zip file containing binary attachments
• ZIP_XML: data in XML format in a zip file containing binary attachments

createdById (string; system field): The ID of the user who created this job. All batches must be created by this same user.

createdDate (dateTime; system field): The date and time in the UTC time zone when the job was created.

externalIdFieldName (string; required with upsert): The name of the external ID field for an upsert().

numberRecordsFailed (int; do not specify for a new job): The number of records that were not processed successfully in this job. This field is available in API version 19.0 and later.

numberRecordsProcessed (int; do not specify for a new job): The number of records already processed. This number increases as more batches are processed.

numberRetries (int): The number of times that Salesforce attempted to save the results of an operation. The repeated attempts are due to a problem, such as lock contention.

object (string; required): The object type for the data being processed. All data in a job must be of a single object type.

operation (OperationEnum; required): The processing operation for all the batches in the job. The valid values are:
• delete
• insert
• query (pilot only)
• upsert
• update
• hardDelete
Caution: The operation value must be all lower case. For example, you get an error if you use INSERT instead of insert.

state (JobStateEnum; required if creating, closing, or aborting a job): The current state of processing for the job:
• Open: The job has been created, and batches can be added to the job.
• Closed: No new batches can be added to this job. Batches associated with the job may be processed after a job is closed. You cannot edit or save a closed job.
• Aborted: The job has been aborted. You can abort a job if you created it or if you have the "Manage Data Integrations" permission in your profile.
• Failed: The job has failed. Batches that were successfully processed can't be rolled back. The BatchInfoList contains a list of all batches for the job. From the results of BatchInfoList, results can be retrieved for completed batches. The results indicate which records have been processed. The numberRecordsFailed field contains the number of records that were not processed successfully.

systemModstamp (dateTime; system field): Date and time in the UTC time zone when the job finished.

totalProcessingTime (long; do not specify for a new job): The number of milliseconds taken to process the job. This is the sum of the total processing times of all the batches in the job.
See Also:
Working with Jobs
Quick Start
Web Services API Developer's Guide
BatchInfo
A BatchInfo contains one batch of data for you to submit to Salesforce for processing.
Fields

apiActiveProcessingTime (long; system field): The number of milliseconds taken to actively process the batch, including apexProcessingTime. This doesn't include the time the batch waited in the queue to be processed or the time required for serialization and deserialization. See also totalProcessingTime. This field is available in API version 19.0 and later.

createdDate (dateTime; system field): The date and time in the UTC time zone when the batch was created. This is not the time processing began, but the time the batch was added to the job.

id (string; required): The ID of the batch. May be globally unique, but does not have to be.

jobId (string; required): The unique, 18-character ID for the job associated with this batch.

numberRecordsFailed (int; system field): The number of records that were not processed successfully in this batch. This field is available in API version 19.0 and later.

stateMessage (string; system field): Contains details about the state. For example, if the state value is Failed, this field contains the reasons for failure. If there are multiple failures, the message may be truncated. If so, fix the known errors and re-submit the batch. Even if the batch failed, some records could have been completed successfully.

systemModstamp (dateTime; system field): The date and time in the UTC time zone that processing ended. This is only valid when the state is Completed.

totalProcessingTime (long; system field): The number of milliseconds taken to process the batch. This excludes the time the batch waited in the queue to be processed. See also apexProcessingTime and apiActiveProcessingTime. This field is available in API version 19.0 and later.
See Also:
Working with Batches
Interpreting Batch State
Quick Start
Web Services API Developer's Guide
HTTP Status Codes

HTTP 400
The operation failed to complete successfully due to an invalid request.
HTTP 405
An HTTP method other than GET or POST was sent to the URI.
HTTP 415
You may have set compression to an unsupported value. The only valid compression value is gzip. Compression is
optional, but strongly recommended.
HTTP 500
Generally, a server error.
Errors
Operations that you perform with Bulk API might trigger error codes. The following list shows the most common error codes
and the Bulk API action that might have triggered them.
ClientInputError
The operation failed with an unknown client-side error.
For binary attachments, the request content is provided both as an input stream and an attachment.
ExceededQuota
The job or batch you tried to create exceeds the allowed number for the past 24 hour period.
FeatureNotEnabled
The Bulk API is not enabled for this organization.
InvalidBatch
The batch ID specified in a batch update or query is invalid.
This error code is returned for binary attachments when the zip content is malformed or the following conditions occur:
• The request.txt file can't be found, can't be read, is a directory, or contains invalid content.
• The decompressed size of a binary attachment is too large.
• The size of the zip file is too large.
• The total decompressed size of all the binary attachments is too large.
Note: A StatusCode of INVALID_FIELD is returned for the following conditions:
• A binary file referenced in the batch data is missing or is a directory.
• A binary file referenced in the batch data doesn't start with #.
For more information about binary attachment limits, see Binary content on page 55.
InvalidJob
The job ID specified in a query or update for a job, or a create, update, or query for batches is invalid.
The user attempted to create a job using a zip content type in API version 19.0 or earlier.
InvalidJobState
The job state specified in a job update operation is invalid.
InvalidOperation
The operation specified in a URI for a job is invalid. Check the spelling of “job” in the URI.
InvalidSessionId
The session ID specified is invalid.
InvalidUrl
The URI specified is invalid.
InvalidUser
Either the user sending a Bulk API request doesn't have the correct permission, or the job or batch specified was created by another user.
InvalidXML
XML contained in the request body is invalid.
Timeout
The connection timed out. This error is thrown if Salesforce takes too long to process a batch. For more information
on timeout limits, see Batch processing time on page 54. If you get a timeout error when processing a batch, split your
batch into smaller batches, and try again.
TooManyLockFailure
Too many lock failures while processing the current batch. This error may be returned during processing of a batch. To
resolve, analyze the batches for lock conflicts. See General Guidelines for Data Loads on page 12.
Unknown
Exception with unknown cause occurred.
In addition, Bulk API uses the same status codes and exception codes as the Web services API. For more information on
these codes, see “ExceptionCode” in the Web Services API Developer's Guide.
See Also:
HTTP Status Codes
Handling Failed Records in Batches
Bulk API Limits

Batch content
Each batch must contain exactly one CSV or XML file containing records for a single object, or the batch is not processed
and stateMessage is updated. Use the enterprise WSDL for the correct format for object records.
Batch limit
You can submit up to 1000 batches per rolling 24 hour period. You can't create new batches associated with a job that
is more than 24 hours old.
Batch lifespan
Batches and jobs that are older than seven days are removed from the queue regardless of job status. The seven days is
measured from the youngest batch associated with a job, or the age of the job if there are no batches. You can't create
new batches associated with a job that is more than 24 hours old.
Batch size
• Batches can consist of a single CSV or XML file that can be no larger than 10 MB.
• A batch can contain a maximum of 10,000 records.
• A batch can contain a maximum of 10,000,000 characters for all the data in a batch.
• A field can contain a maximum of 32,000 characters.
• A record can contain a maximum of 5,000 fields.
• A record can contain a maximum of 400,000 characters for all its fields.
• A batch must contain some content or an error occurs.
Batch processing time
Even if the batch failed, some records could have completed successfully. To get batch results to see which records, if
any, were processed, see Getting Batch Results on page 42. If you get a timeout error when processing a batch, split
your batch into smaller batches, and try again.
Binary content
Compression
The only valid compression value is gzip. Compression is optional, but strongly recommended. Note that compression
doesn't affect the character limits defined in Batch size.
Job abort
Any user with correct permission can abort a job. Only the user who created a job can close it.
Job close
Only the user who created a job can close it. Any user with correct permission can abort a job.
Job content
Each job can specify one operation and one object. Batches associated with this job contain records of one object.
Optionally, the job may specify serial processing mode, which is used only when previously submitted asynchronous jobs
have accidentally produced contention because of locks. Use only when advised by Salesforce.
Job external ID
You can't edit the value of an external ID field in JobInfo. When specifying an external ID, the operation must be upsert.
If you try to use it with create or update, an error is generated.
Job lifespan
Batches and jobs that are older than seven days are removed from the queue regardless of job status. The seven days is
measured from the youngest batch associated with a job, or the age of the job if there are no batches. You can't create
new batches associated with a job that is more than 24 hours old.
Portal users
Regardless of whether the “API Enabled” profile permission is granted, portal users (Customer Portal, Self-Service
portal, Partner Portal and PRM Portal) can't access Bulk API.
query SOQL (pilot)
Bulk API does not support the following SOQL:
• COUNT
• ROLLUP
• SUM
• Nested SOQL queries
Appendix A
Sample Client Application Using Java
Use the code sample in this section to create a test client application that inserts a number of account records using the
REST-based Bulk API.
In addition to the step-by-step instructions that follow, the end of this section provides the complete code to make copying and pasting easier.
Note: Before you begin building an integration or other client application:
• Install your development platform according to its product documentation.
• Read through all the steps before creating the test client application. You may also wish to review the rest of this
document to familiarize yourself with terms and concepts.
For example, if wsc.jar is installed in C:\salesforce\wsc, and the partner WSDL is saved to
C:\salesforce\wsdl\partner:
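A sketch of the stub-generation command with the WSC wsdlc tool, under those assumptions (the WSDL file name is illustrative):

java -classpath C:\salesforce\wsc\wsc.jar com.sforce.ws.tools.wsdlc C:\salesforce\wsdl\partner\partner.wsdl partner.jar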
wsc.jar and the generated partner.jar are the only libraries needed in the classpath for the code examples in the
following sections.
Walk Through the Sample Code

The sample uses the following imports:

import java.io.*;
import java.util.*;
import com.sforce.async.*;
import com.sforce.soap.partner.PartnerConnection;
import com.sforce.ws.ConnectionException;
import com.sforce.ws.ConnectorConfig;
/**
* Creates a Bulk API job and uploads batches for a CSV file.
*/
public void runSample(String sobjectType, String userName,
String password, String sampleFileName)
throws AsyncApiException, ConnectionException, IOException {
RestConnection connection = getRestConnection(userName, password);
JobInfo job = createJob(sobjectType, connection);
List<BatchInfo> batchInfoList = createBatchesFromCSVFile(connection, job,
sampleFileName);
closeJob(connection, job.getId());
awaitCompletion(connection, job, batchInfoList);
checkResults(connection, job, batchInfoList);
}
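A minimal way to drive runSample from a main method (the class name BulkExample and the argument handling are illustrative):

public static void main(String[] args)
    throws AsyncApiException, ConnectionException, IOException {
    // args: username, password, path to a CSV file of Account records
    BulkExample example = new BulkExample();
    example.runSample("Account", args[0], args[1], args[2]);
}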
/**
* Create the RestConnection used to call Bulk API operations.
*/
private RestConnection getRestConnection(String userName, String password)
throws ConnectionException, AsyncApiException {
ConnectorConfig partnerConfig = new ConnectorConfig();
partnerConfig.setUsername(userName);
partnerConfig.setPassword(password);
partnerConfig.setAuthEndpoint("https://login.salesforce.com/services/Soap/u/21.0");
// Creating the connection performs the login; the resulting session ID
// and endpoint are stored in partnerConfig.
new PartnerConnection(partnerConfig);
// Reuse the session for the Bulk API. The Bulk API endpoint replaces the
// Soap/u/ portion of the SOAP endpoint with async/ plus the API version.
ConnectorConfig config = new ConnectorConfig();
config.setSessionId(partnerConfig.getSessionId());
String soapEndpoint = partnerConfig.getServiceEndpoint();
config.setRestEndpoint(soapEndpoint.substring(0, soapEndpoint.indexOf("Soap/"))
+ "async/21.0");
// This should only be false when debugging.
config.setCompression(true);
// Set this to true to see HTTP requests and responses on stdout.
config.setTraceMessage(false);
RestConnection connection = new RestConnection(config);
return connection;
}
This RestConnection instance is the base for using the Bulk API. The instance can be reused for the rest of the application
life span.
/**
* Create a new job using the Bulk API.
*
* @param sobjectType
* The object type being loaded, such as "Account"
* @param connection
* RestConnection used to create the new job.
* @return The JobInfo for the new job.
* @throws AsyncApiException
*/
private JobInfo createJob(String sobjectType, RestConnection connection)
throws AsyncApiException {
JobInfo job = new JobInfo();
job.setObject(sobjectType);
job.setOperation(OperationEnum.insert);
job.setContentType(ContentType.CSV);
job = connection.createJob(job);
System.out.println(job);
return job;
}
When a job is created, it's in the Open state. In this state new batches can be added to the job. Once a job is Closed, batches
can no longer be added.
/**
* Create and upload batches using a CSV file.
* The file is split into batch files of the appropriate size.
*
* @param connection
* Connection to use for creating batches
* @param jobInfo
* Job associated with new batches
* @param csvFileName
* The source file for batch data
*/
private List<BatchInfo> createBatchesFromCSVFile(RestConnection connection,
JobInfo jobInfo, String csvFileName)
throws IOException, AsyncApiException {
List<BatchInfo> batchInfos = new ArrayList<BatchInfo>();
BufferedReader rdr = new BufferedReader(
new InputStreamReader(new FileInputStream(csvFileName))
);
// read the CSV header row
byte[] headerBytes = (rdr.readLine() + "\n").getBytes("UTF-8");
int headerBytesLength = headerBytes.length;
File tmpFile = File.createTempFile("bulkAPIInsert", ".csv");
/**
* Create a batch by uploading the contents of the file.
* This closes the output stream.
*
* @param tmpOut
* The output stream used to write the CSV data for a single batch.
* @param tmpFile
* The file associated with the above stream.
* @param batchInfos
* The batch info for the newly created batch is added to this list.
* @param connection
* The RestConnection used to create the new batch.
* @param jobInfo
* The JobInfo associated with the new batch.
*/
private void createBatch(FileOutputStream tmpOut, File tmpFile,
List<BatchInfo> batchInfos, RestConnection connection, JobInfo jobInfo)
throws IOException, AsyncApiException {
tmpOut.flush();
tmpOut.close();
FileInputStream tmpInputStream = new FileInputStream(tmpFile);
try {
BatchInfo batchInfo =
connection.createBatchFromStream(jobInfo, tmpInputStream);
System.out.println(batchInfo);
batchInfos.add(batchInfo);
} finally {
tmpInputStream.close();
}
}
Once the server receives a batch it's immediately queued for processing. Any errors in formatting aren't reported when sending
the batch. These errors are reported in the result data when the batch is processed.
Tip: To import binary attachments, use the following methods. Specify the CSV or XML content for the batch in the batchContent parameter, or include request.txt in the attached files and pass null to the batchContent parameter. These methods are provided by com.sforce.async.RestConnection:
• createBatchFromDir()
• createBatchWithFileAttachments()
• createBatchWithInputStreamAttachments()
• createBatchFromZipStream()
/**
* Wait for a job to complete by polling the Bulk API.
*
* @param connection
* RestConnection used to check results.
* @param job
* The job awaiting completion.
* @param batchInfoList
* List of batches for this job.
* @throws AsyncApiException
*/
private void awaitCompletion(RestConnection connection, JobInfo job,
List<BatchInfo> batchInfoList)
throws AsyncApiException {
A batch is done when it's either failed or completed. This code loops infinitely until all the batches for the job have either
failed or completed.
/**
* Gets the results of the operation and checks for errors.
*/
private void checkResults(RestConnection connection, JobInfo job,
List<BatchInfo> batchInfoList)
throws AsyncApiException, IOException {
// batchInfoList was populated when batches were created and submitted
for (BatchInfo b : batchInfoList) {
CSVReader rdr =
new CSVReader(connection.getBatchResultStream(job.getId(), b.getId()));
List<String> resultHeader = rdr.nextRecord();
int resultCols = resultHeader.size();
List<String> row;
while ((row = rdr.nextRecord()) != null) {
Map<String, String> resultInfo = new HashMap<String, String>();
for (int i = 0; i < resultCols; i++) {
resultInfo.put(resultHeader.get(i), row.get(i));
}
boolean success = Boolean.valueOf(resultInfo.get("Success"));
boolean created = Boolean.valueOf(resultInfo.get("Created"));
String id = resultInfo.get("Id");
String error = resultInfo.get("Error");
if (success && created) {
System.out.println("Created row with id " + id);
} else if (!success) {
System.out.println("Failed with error: " + error);
}
}
}
}
This code retrieves the results for each record and reports whether the operation succeeded or failed. If an error occurred for
a record, the code prints out the error.
The complete sample code follows.

import java.io.*;
import java.util.*;
import com.sforce.async.*;
import com.sforce.soap.partner.PartnerConnection;
import com.sforce.ws.ConnectionException;
import com.sforce.ws.ConnectorConfig;
/**
* Creates a Bulk API job and uploads batches for a CSV file.
*/
public void runSample(String sobjectType, String userName,
String password, String sampleFileName)
throws AsyncApiException, ConnectionException, IOException {
RestConnection connection = getRestConnection(userName, password);
JobInfo job = createJob(sobjectType, connection);
List<BatchInfo> batchInfoList = createBatchesFromCSVFile(connection, job,
sampleFileName);
closeJob(connection, job.getId());
awaitCompletion(connection, job, batchInfoList);
checkResults(connection, job, batchInfoList);
}
/**
* Gets the results of the operation and checks for errors.
*/
private void checkResults(RestConnection connection, JobInfo job,
List<BatchInfo> batchInfoList)
throws AsyncApiException, IOException {
// batchInfoList was populated when batches were created and submitted
for (BatchInfo b : batchInfoList) {
CSVReader rdr =
new CSVReader(connection.getBatchResultStream(job.getId(), b.getId()));
List<String> resultHeader = rdr.nextRecord();
int resultCols = resultHeader.size();
List<String> row;
while ((row = rdr.nextRecord()) != null) {
/**
* Wait for a job to complete by polling the Bulk API.
*
* @param connection
* RestConnection used to check results.
* @param job
* The job awaiting completion.
* @param batchInfoList
* List of batches for this job.
* @throws AsyncApiException
*/
private void awaitCompletion(RestConnection connection, JobInfo job,
List<BatchInfo> batchInfoList)
throws AsyncApiException {
long sleepTime = 0L;
Set<String> incomplete = new HashSet<String>();
for (BatchInfo bi : batchInfoList) {
incomplete.add(bi.getId());
}
while (!incomplete.isEmpty()) {
try {
Thread.sleep(sleepTime);
} catch (InterruptedException e) {}
System.out.println("Awaiting results..." + incomplete.size());
sleepTime = 10000L;
BatchInfo[] statusList =
connection.getBatchInfoList(job.getId()).getBatchInfo();
for (BatchInfo b : statusList) {
if (b.getState() == BatchStateEnum.Completed
|| b.getState() == BatchStateEnum.Failed) {
if (incomplete.remove(b.getId())) {
System.out.println("BATCH STATUS:\n" + b);
}
}
}
}
}
/**
* Create a new job using the Bulk API.
*
* @param sobjectType
* The object type being loaded, such as "Account"
* @param connection
* RestConnection used to create the new job.
* @return The JobInfo for the new job.
* @throws AsyncApiException
*/
private JobInfo createJob(String sobjectType, RestConnection connection)
throws AsyncApiException {
JobInfo job = new JobInfo();
job.setObject(sobjectType);
job.setOperation(OperationEnum.insert);
job.setContentType(ContentType.CSV);
job = connection.createJob(job);
System.out.println(job);
return job;
}
/**
* Create the RestConnection used to call Bulk API operations.
*/
private RestConnection getRestConnection(String userName, String password)
throws ConnectionException, AsyncApiException {
ConnectorConfig partnerConfig = new ConnectorConfig();
partnerConfig.setUsername(userName);
partnerConfig.setPassword(password);
partnerConfig.setAuthEndpoint("https://login.salesforce.com/services/Soap/u/21.0");
/**
* Create and upload batches using a CSV file.
* The file is split into batch files of the appropriate size.
*
* @param connection
* Connection to use for creating batches
* @param jobInfo
* Job associated with new batches
* @param csvFileName
* The source file for batch data
*/
private List<BatchInfo> createBatchesFromCSVFile(RestConnection connection,
JobInfo jobInfo, String csvFileName)
throws IOException, AsyncApiException {
List<BatchInfo> batchInfos = new ArrayList<BatchInfo>();
BufferedReader rdr = new BufferedReader(
new InputStreamReader(new FileInputStream(csvFileName))
);
// read the CSV header row
byte[] headerBytes = (rdr.readLine() + "\n").getBytes("UTF-8");
int headerBytesLength = headerBytes.length;
File tmpFile = File.createTempFile("bulkAPIInsert", ".csv");
/**
* Create a batch by uploading the contents of the file.
* This closes the output stream.
*
* @param tmpOut
* The output stream used to write the CSV data for a single batch.
* @param tmpFile
* The file associated with the above stream.
* @param batchInfos
* The batch info for the newly created batch is added to this list.
* @param connection
* The RestConnection used to create the new batch.
* @param jobInfo
* The JobInfo associated with the new batch.
*/
private void createBatch(FileOutputStream tmpOut, File tmpFile,
List<BatchInfo> batchInfos, RestConnection connection, JobInfo jobInfo)
throws IOException, AsyncApiException {
tmpOut.flush();
tmpOut.close();
FileInputStream tmpInputStream = new FileInputStream(tmpFile);
try {
BatchInfo batchInfo =
connection.createBatchFromStream(jobInfo, tmpInputStream);
System.out.println(batchInfo);
batchInfos.add(batchInfo);
} finally {
tmpInputStream.close();
}
}
Glossary
A
Apex
Force.com Apex code is a strongly-typed, object-oriented programming language that allows developers to execute flow
and transaction control statements on the Force.com platform server in conjunction with calls to the Force.com API.
Using syntax that looks like Java and acts like database stored procedures, Apex code enables developers to add business
logic to most system events, including button clicks, related record updates, and Visualforce pages. Apex scripts can be
initiated by Web service requests and from triggers on objects.
App
Short for “application.” A collection of components such as tabs, reports, dashboards, and Visualforce pages that address
a specific business need. Salesforce provides standard apps such as Sales and Call Center. You can customize the standard
apps to match the way you work. In addition, you can package an app and upload it to the AppExchange along with
related components such as custom fields, custom tabs, and custom objects. Then, you can make the app available to other
Salesforce users from the AppExchange.
Asynchronous Calls
A call that does not return results immediately because the operation may take a long time. Calls in the Metadata API
and Bulk API are asynchronous.
B
Batch, Bulk API
A batch is a CSV or XML representation of a set of records in the Bulk API. You process a set of records by creating a
job that contains one or more batches. Each batch is processed independently by the server, not necessarily in the order
it is received. See Job, Bulk API.
Boolean Operators
You can use Boolean operators in report filters to specify the logical relationship between two values. For example, the
AND operator between two values yields search results that include both values. Likewise, the OR operator between two
values yields search results that include either value.
Bulk API
The REST-based Bulk API is optimized for processing large sets of data. It allows you to insert, update, upsert, or delete
a large number of records asynchronously by submitting a number of batches which are processed in the background by
Salesforce. See also Web Services API.
C
Client App
An app that runs outside the Salesforce user interface and uses only the Force.com API or Bulk API. It typically runs on
a desktop or mobile device. These apps treat the platform as a data source, using the development model of whatever tool
and platform for which they are designed. See also Composite App and Native App.
Custom Field
A field that can be added in addition to the standard fields to customize Salesforce for your organization’s needs.
Custom Object
Custom records that allow you to store information unique to your organization.
D
Data Loader
A Force.com platform tool used to import and export data from your Salesforce organization.
Database
An organized collection of information. The underlying architecture of the Force.com platform includes a database where
your data is stored.
Database Table
A list of information, presented with rows and columns, about the person, thing, or concept you want to track. See also
Object.
Decimal Places
Parameter for number, currency, and percent custom fields that indicates the number of digits you can enter to the right
of the decimal point; for example, a setting of 2 allows values such as 4.98. Note that the system rounds the decimal numbers
you enter, if necessary. For example, if you enter 4.986 in a field with Decimal Places of 2, the number rounds to 4.99.
Dependent Field
Any custom picklist or multi-select picklist field that displays available values based on the value selected in its corresponding
controlling field.
Developer Edition
A free, fully-functional Salesforce organization designed for developers to extend, integrate, and develop with the Force.com
platform. Developer Edition accounts are available on developer.force.com.
Developer Force
The Developer Force website at developer.force.com provides a full range of resources for platform developers, including
sample code, toolkits, an online developer community, and the ability to obtain limited Force.com platform environments.
E
Enterprise Edition
A Salesforce edition designed for larger, more complex businesses.
F
Field
A part of an object that holds a specific piece of information, such as a text or currency value.
Field-Level Security
Settings that determine whether fields are hidden, visible, read only, or editable for users based on their profiles. Available
in Enterprise, Unlimited, and Developer Editions only.
Force.com
The salesforce.com platform for building applications in the cloud. Force.com combines a powerful user interface, operating
system, and database to allow you to customize and deploy applications in the cloud for your entire enterprise.
Force.com IDE
An Eclipse plug-in that allows developers to manage, author, debug and deploy Force.com applications in the Eclipse
development environment.
Foreign Key
A field whose value is the same as the primary key of another table. You can think of a foreign key as a copy of a primary
key from another table. A relationship is made between two tables by matching the values of the foreign key in one table
with the values of the primary key in another.
Formula Field
A type of custom field. Formula fields automatically calculate their values based on the values of merge fields, expressions,
or other values.
Function
Built-in formulas that you can customize with input parameters. For example, the DATE function creates a date field
type from a given year, month, and day.
G
Gregorian Year
A calendar based on a twelve-month structure used throughout much of the world.
H
No Glossary items for this entry.
I
ID
See Salesforce Record ID.
Instance
The cluster of software and hardware represented as a single logical server that hosts an organization's data and runs its
applications. The Force.com platform runs on multiple instances, but data for any single organization is always consolidated
on a single instance.
Integration User
A Salesforce user defined solely for client apps or integrations. Also referred to as the logged-in user in a Web services
API context.
ISO Code
The International Organization for Standardization country code, which represents each country by two letters.
J
Job, Bulk API
A job in the Bulk API specifies which object is being processed (for example, Account, Opportunity) and what type of
action is being used (insert, upsert, update, or delete). You process a set of records by creating a job that contains one or
more batches. See Batch, Bulk API.
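To make the relationship between the two terms concrete, a job is created by sending a JobInfo request that names the operation, the object, and the content type; one or more batches of data are then added to it. The request below is a minimal sketch of an insert job on the Contact object; see the Quick Start chapter and the JobInfo reference in this guide for the authoritative format.

<?xml version="1.0" encoding="UTF-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
  <operation>insert</operation>
  <object>Contact</object>
  <contentType>CSV</contentType>
</jobInfo>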
K
No Glossary items for this entry.
L
Locale
The country or geographic region in which the user is located. The setting affects the format of date and number fields;
for example, dates in the English (United States) locale display as 06/30/2000, while the same date in the English (United
Kingdom) locale displays as 30/06/2000.
In Professional, Enterprise, Unlimited, and Developer Edition organizations, a user’s individual Locale setting overrides
the organization’s Default Locale setting. In Personal and Group Editions, the organization-level locale field is called
Locale, not Default Locale.
Logged-in User
In a Web services API context, the username used to log into Salesforce. Client applications run with the permissions
and sharing of the logged-in user. Also referred to as an integration user.
Lookup Field
A type of field that contains a link to another record. You can display lookup fields on page layouts where the
object has a lookup or master-detail relationship with another object. For example, cases have a lookup relationship with
assets that allows users to select an asset using a lookup dialog from the case edit page and click the name of the asset from
the case detail page.
M
Managed Package
A collection of application components that is posted as a unit on the AppExchange and associated with a namespace and
possibly a License Management Organization. To support upgrades, a package must be managed. An organization can
create a single managed package that can be downloaded and installed by many different organizations. Managed packages
differ from unmanaged packages by having some locked components, allowing the managed package to be upgraded later.
Unmanaged packages do not include locked components and cannot be upgraded. In addition, managed packages obfuscate
certain components (like Apex) on subscribing organizations to protect the intellectual property of the developer.
Manual Sharing
Record-level access rules that allow record owners to give read and edit permissions to other users who might not have
access to the record any other way.
Many-to-Many Relationship
A relationship where each side of the relationship can have many children on the other side. Many-to-many relationships
are implemented through the use of junction objects.
Master-Detail Relationship
A relationship between two different types of records that associates the records with each other. For example, accounts
have a master-detail relationship with opportunities. This type of relationship affects record deletion and security, and it
makes the lookup relationship field required on the page layout.
Metadata
Information about the structure, appearance, and functionality of an organization and any of its parts. Force.com uses
XML to describe metadata.
Multitenancy
An application model where all users and apps share a single, common infrastructure and code base.
N
Native App
An app that is built exclusively with setup (metadata) configuration on Force.com. Native apps do not require any external
services or infrastructure.
O
Object
An object allows you to store information in your Salesforce organization. The object is the overall definition of the type
of information you are storing. For example, the case object allows you to store information regarding customer inquiries.
For each object, your organization will have multiple records that store the information about specific instances of that
type of data. For example, you might have a case record to store the information about Joe Smith's training inquiry and
another case record to store the information about Mary Johnson's configuration issue.
Object-Level Security
Settings that allow an administrator to hide whole tabs and objects from a user so that he or she does not know that type
of data exists. On the platform you set object-level access rules with object permissions on user profiles.
One-to-Many Relationship
A relationship in which a single object is related to many other objects. For example, an account may have one or more
related contacts.
Organization-Wide Defaults
Settings that allow you to specify the baseline level of data access that a user has in your organization. For example, you
can allow any user to see any record of a particular object that is enabled in their user profile, while requiring extra
permissions to edit one.
Outbound Message
An outbound message is a workflow, approval, or milestone action that sends the information you specify to an endpoint
you designate, such as an external service. An outbound message sends the data in the specified fields in the form of a
SOAP message to the endpoint. Outbound messaging is configured in the Salesforce setup menu. Then you must configure
the external endpoint. You can create a listener for the messages using the Web services API.
Owner
Individual user to which a record (for example, a contact or case) is assigned.
P
Package
A group of Force.com components and applications that are made available to other organizations through the
AppExchange. You use packages to bundle an app along with any related components so that you can upload them to
AppExchange together.
Parent Account
An organization or company with which an account is affiliated. By specifying a parent for an account, you can get a global view
of all parent/subsidiary relationships using the View Hierarchy link.
Picklist
Selection list of options available for specific fields in a Salesforce object, for example, the Industry field for accounts.
Users can choose a single value from a list of options rather than make an entry directly in the field. See also Master
Picklist.
Picklist (Multi-Select)
Selection list of options available for specific fields in a Salesforce object. Multi-select picklists allow users to choose one
or more values. Users can choose a value by double-clicking it, or choose additional values from a scrolling list by holding
down the CTRL key while clicking a value and using the arrow icon to move them to the selected box.
Picklist Values
Selections displayed in drop-down lists for particular fields. Some values come predefined, and other values can be changed
or defined by an administrator.
Platform Edition
A Salesforce edition based on either Enterprise Edition or Unlimited Edition that does not include any of the standard
Salesforce CRM apps, such as Sales or Service & Support.
Primary Key
A relational database concept. Each table in a relational database has a field in which the data value uniquely identifies
the record. This field is called the primary key. The relationship is made between two tables by matching the values of
the foreign key in one table with the values of the primary key in another.
Production Organization
A Salesforce organization that has live users accessing data.
Professional Edition
A Salesforce edition designed for businesses that need full-featured CRM functionality.
Q
Query String Parameter
A name-value pair that's included in a URL, typically after a '?' character. For example:
http://na1.salesforce.com/001/e?name=value
R
Record
A single instance of a Salesforce object. For example, “John Jones” might be the name of a contact record.
Record Name
A standard field on all Salesforce objects. Whenever a record name is displayed in a Force.com application, the value is
represented as a link to a detail view of the record. A record name can be either free-form text or an autonumber field.
Record Name does not have to be a unique value.
Record Type
A field available for certain records that can include some or all of the standard and custom picklist values for that record.
Record types are special fields that you can associate with profiles to make only the included picklist values available to
users with that profile.
Record-Level Security
A method of controlling data in which you can allow a particular user to view and edit an object, but then restrict the
records that the user is allowed to see.
Recycle Bin
A page that lets you view and restore deleted information. Access the Recycle Bin by using the link in the sidebar.
Related Object
Objects chosen by an administrator to display in the Console tab's mini view when records of a particular type are shown
in the console's detail view. For example, when a case is in the detail view, an administrator can choose to display an
associated account, contact, or asset in the mini view.
Relationship
A connection between two objects, used to create related lists in page layouts and detail levels in reports. Matching values
in a specified field in both objects are used to link related data; for example, if one object stores data about companies and
another object stores data about people, a relationship allows you to find out which people work at the company.
Relationship Query
In a SOQL context, a query that traverses the relationships between objects to identify and return results. Parent-to-child
and child-to-parent syntax differs in SOQL queries.
Role Hierarchy
A record-level security setting that defines different levels of users such that users at higher levels can view and edit
information owned by or shared with users beneath them in the role hierarchy, regardless of the organization-wide sharing
model settings.
Running User
Each dashboard has a running user, whose security settings determine which data to display in a dashboard. If the running
user is a specific user, all dashboard viewers see data based on the security settings of that user—regardless of their own
personal security settings. For dynamic dashboards, you can set the running user to be the logged-in user, so that each
user sees the dashboard according to his or her own access level.
S
SaaS
See Software as a Service (SaaS).
Sandbox Organization
A nearly identical copy of a Salesforce production organization. You can create multiple sandboxes in separate environments
for a variety of purposes, such as testing and training, without compromising the data and applications in your production
environment.
Session ID
An authentication token that is returned when a user successfully logs in to Salesforce. The Session ID prevents a user
from having to log in again every time he or she wants to perform another action in Salesforce. A session ID is different
from a record ID or Salesforce ID, which are terms for the unique ID of a Salesforce record.
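In the Bulk API, the session ID returned by the SOAP login() call is passed on every subsequent request in the X-SFDC-Session HTTP header. The request sketch below is illustrative only; the instance name and session ID value are placeholders.

POST https://instance_name.salesforce.com/services/async/21.0/job
X-SFDC-Session: <sessionId returned by login()>
Content-Type: application/xml; charset=UTF-8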
Session Timeout
The period of time after login before a user is automatically logged out. Sessions expire automatically after a predetermined
length of inactivity, which can be configured in Salesforce by clicking Your Name ➤ Setup ➤ Security Controls. The
default is 120 minutes (two hours). The inactivity timer is reset to zero if a user takes an action in the Web interface or
makes an API call.
Setup
An administration area where you can customize and define Force.com applications. Access Setup through the Your
Name ➤ Setup link at the top of Salesforce pages.
Sharing
Allowing other users to view or edit information you own. There are different ways to share data:
• Sharing Model—defines the default organization-wide access levels that users have to each other’s information and
whether to use the hierarchies when determining access to data.
• Role Hierarchy—defines different levels of users such that users at higher levels can view and edit information owned
by or shared with users beneath them in the role hierarchy, regardless of the organization-wide sharing model settings.
• Sharing Rules—allow an administrator to specify that all information created by users within a given group or role is
automatically shared to the members of another group or role.
• Manual Sharing—allows individual users to share a specific account or opportunity with other users or groups.
• Apex-Managed Sharing—enables developers to programmatically manipulate sharing to support their application’s
behavior. See Apex-Managed Sharing.
Sharing Model
Behavior defined by your administrator that determines default access by users to different types of records.
Sharing Rule
Type of default sharing created by administrators. Allows users in a specified group or role to have access to all information
created by users within a given group or role.
Standard Object
A built-in object included with the Force.com platform. You can also build custom objects to store information that is
unique to your app.
T
Translation Workbench
The Translation Workbench lets you specify languages you want to translate, assign translators to languages, create
translations for customizations you’ve made to your Salesforce organization, and override labels and translations from
managed packages. Everything from custom picklist values to custom fields can be translated so your global users can use
all of Salesforce in their language.
U
Unlimited Edition
Unlimited Edition is salesforce.com's flagship solution for maximizing CRM success and extending that success across
the entire enterprise through the Force.com platform.
Unmanaged Package
A package that cannot be upgraded or controlled by its developer.
V
Visualforce
A simple, tag-based markup language that allows developers to easily define custom pages and components for apps built
on the platform. Each tag corresponds to a coarse or fine-grained component, such as a section of a page, a related list,
or a field. The components can either be controlled by the same logic that is used in standard Salesforce pages, or developers
can associate their own logic with a controller written in Apex.
W
Web Service
A mechanism by which two applications can easily exchange data over the Internet, even if they run on different platforms,
are written in different languages, or are geographically remote from each other.
X
XML (Extensible Markup Language)
A markup language that enables the sharing and transportation of structured data. All Force.com components that are
retrieved or deployed through the Metadata API are represented by XML definitions.
Y
No Glossary items for this entry.
Z
No Glossary items for this entry.
Index
A
Abort job 33
Add batch 8, 36
Attachments 14, 22
B
Batch lifespan 34
Batches 35
    adding to a job 8, 36
    checking status 9
    creating 36
    getting a request 41
    getting information 38
    getting information for job 39
    getting results 42
    handling failed rows 43
    interpreting state 40
    monitoring 37
    retrieving results 10
Binary attachments
    batch file 23
    creating a batch 25
    creating a job 24
    introduction 22
    manifest file 23
    zip file 24
C
Checking batch status 9
Client application 57
Close job 9, 31
Code sample
    setting up organization 5, 57
    setting up your client 57
    walk through sample code 58
Create batch 8, 36
Create job 7, 30
CSV
    external ID 16
    null values 17
    polymorphic field 16
    record rows 17
    relationship field 16
    sample file 17
CSV data files 16
cURL 6
D
Data files
    CSV 16
    date format 15
    finding field names 15
    introduction 14
    XML 18
Date format 15
E
Error codes 52
Errors
    batch results 43
External ID 16
F
Finding field names 15
G
Get job details 32
Get job status and results 32
Getting started sample 5, 57
Guidelines 12
H
HTTP requests 5, 6
HTTP status codes 52
I
Introduction 3
J
Java client application 57
Job lifespan 34
Jobs 29, 46
    aborting 33
    closing 9, 31
    creating 7, 30
    getting details 32
    monitoring 31
L
Large data volumes 3
Limits 54
Lock contention 12
Logging in 28
Login 6
Longevity of jobs and batches 34
M
Monitor batch 37
Monitor job 31
O
Operations
    on jobs 29
P
Planning 11
Polymorphic field 16, 18
Q
Quick start 5
    cURL 5
    setting up organization 5, 57
    setting up your client 5
R
Reference 45
    batches (BatchInfo) 50
    error codes 52
    HTTP status codes 52
    jobs (JobInfo) 46
Request
    getting for batch 41
Resource 28
Results
    getting for batch 42
    handling failed rows 43
Retrieving batch results 10
S
Sample code 57
Sample for quick start 5
Sample job operations 29
Schema 46
Session header 28
SOAP 6
State
    interpreting for batch 40
Status
    get for job 32
    getting for batch 38
    getting for batches 39
    interpreting for batch 40
Status codes 52
U
URI 28
W
WSC 57
WSDL 6
X
XML
    null values 20
    polymorphic field 18
    records 20
    relationship field 18
    sample file 21
XML data files 18
XSD 46