The Audit Log API lets customers query audit logs for their workspace. These endpoints cover looking up an audit log at a given time or from a given ID, and paginating through audit logs over time.
The current rate limit for all Audit Log API endpoints is 60 requests per minute.
GET /api/v1/logs/audit/search
GET /api/v1/logs/audit/earliest
GET /api/v1/logs/audit/latest
GET /api/v1/logs/audit

Overview

Use the Audit Log API to:
  • Track User Activity: Monitor all actions taken by users in your workspace
  • Maintain Compliance: Keep detailed records for audit and compliance purposes
  • Investigate Incidents: Retrieve historical logs to investigate specific events
  • Export Records: Fetch and store audit logs in your own systems

Endpoints

Search Audit Logs

GET /api/v1/logs/audit/search
Returns the audit log at or after a specified timestamp. Useful to begin pagination from a specific point in time. Parameters:
  • time (required): UTC epoch timestamp, up to 1 year old from now
Example:
curl -X GET "https://api.harvey.ai/api/v1/logs/audit/search?time=1712066546" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json"
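The `time` value is a UTC epoch timestamp in seconds. As a minimal sketch (the `epoch_seconds` and `search_url` helpers are hypothetical names, not part of the API), the snippet below computes a timestamp for N hours ago and builds the search URL:

```python
import time
from datetime import datetime, timedelta, timezone

def epoch_seconds(hours_ago: int) -> int:
    # The `time` parameter is seconds since the Unix epoch, in UTC.
    # Using timezone-aware datetimes avoids local-time mistakes.
    dt = datetime.now(timezone.utc) - timedelta(hours=hours_ago)
    return int(dt.timestamp())

def search_url(hours_ago: int) -> str:
    # Hypothetical helper: builds the /search request URL.
    return f"https://api.harvey.ai/api/v1/logs/audit/search?time={epoch_seconds(hours_ago)}"

url = search_url(24)  # search from roughly 24 hours ago
```

Remember the timestamp may be at most one year old.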

Get Earliest Audit Log

GET /api/v1/logs/audit/earliest
Returns the earliest audit log for the workspace. Useful to begin pagination from the very beginning. This endpoint takes no parameters. Example:
curl -X GET "https://api.harvey.ai/api/v1/logs/audit/earliest" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json"

Get Latest Audit Log

GET /api/v1/logs/audit/latest
Returns the latest audit log for the workspace. Useful to begin pagination from the most recent event. This endpoint takes no parameters. Example:
curl -X GET "https://api.harvey.ai/api/v1/logs/audit/latest" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json"

Query Audit Logs

GET /api/v1/logs/audit
Paginates forward in time from a given audit log ID. Use this endpoint to fetch multiple log entries at once. Parameters:
  • from (required): Audit log ID to begin fetching from (UUID format)
  • take (required): Number of audit log entries to fetch (max 1000)
Example:
curl -X GET "https://api.harvey.ai/api/v1/logs/audit?from=018e983f-d10f-72aa-9c94-d8263e53c6a4&take=100" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json"

Audit Log Types

Each audit log entry includes a type field that identifies the specific action performed. Below is a complete list of audit log types you may encounter:
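Every type follows a `category:action` pattern. A small sketch of parsing it, which also shows one way to keep ingestion resilient when the API adds categories you have not seen before (the helper names are illustrative, not part of the API):

```python
def parse_log_type(log_type: str) -> tuple[str, str]:
    """Split an audit log type like 'admin:add_users' into (category, action)."""
    category, _, action = log_type.partition(":")
    return category, action

KNOWN_CATEGORIES = {"api", "auth", "admin", "user", "system"}

def route(log_type: str) -> str:
    # Treat unrecognized categories as 'unknown' so new event types
    # added to the API later don't break your ingestion pipeline.
    category, _ = parse_log_type(log_type)
    return category if category in KNOWN_CATEGORIES else "unknown"
```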

API Operations

Audit Log Type | Description
api:audit_log_fetch | Fetches audit logs via API
api:create_query | Creates a query via API
api:client_matter_management | Manages client matters via API
api:history_fetch | Fetches history records via API
api:token_mgmt_rotate | Rotates API tokens
api:vault_list_projects | Lists vault projects via API
api:vault_get_metadata | Gets vault metadata via API
api:vault_upload_files | Uploads files to vault via API
api:vault_delete_file | Deletes files from vault via API
api:vault_delete_project | Deletes vault projects via API

Authentication

Audit Log Type | Description
auth:login | User login
auth:logout | User logout
auth:failed | Failed authentication attempt

Admin Operations

Audit Log Type | Description
admin:client_view_workspace_history | Admin views workspace history in client
admin:fetch_workspace_history | Admin fetches workspace history
admin:client_view_workspace_history_item | Admin views specific workspace history item in client
admin:fetch_workspace_history_item | Admin fetches specific workspace history item
admin:delete_workspace_history_item | Admin deletes workspace history item
admin:delete_workspace_history_items | Admin deletes multiple workspace history items
admin:export_workspace_history | Admin exports workspace history
admin:export_query_usage | Admin exports query usage data
admin:client_export_workspace_users | Admin exports workspace users from client
admin:add_users | Admin adds users
admin:remove_users | Admin removes users
admin:grant_perms | Admin grants permissions
admin:revoke_perms | Admin revokes permissions
admin:create_role | Admin creates role
admin:update_role | Admin updates role
admin:delete_role | Admin deletes role
admin:update_user_role | Admin updates user role
admin:create_export_template | Admin creates export template
admin:edit_export_template | Admin edits export template
admin:delete_export_template | Admin deletes export template
admin:download_export_template | Admin downloads export template
admin:bulk_update_role_configs | Admin bulk updates role configurations
admin:get_role_users | Admin gets users for a role
admin:fetch_client_matters | Admin fetches client matters
admin:add_client_matters | Admin adds client matters
admin:delete_client_matters | Admin deletes client matters
admin:enable_integration | Admin enables integration
admin:disable_integration | Admin disables integration
admin:update_integration | Admin updates integration
admin:update_client_matters | Admin updates client matters
admin:manage_client_matters | Admin manages client matters
admin:fetch_stats | Admin fetches statistics
admin:fetch_workspace_users | Admin fetches workspace users
admin:update_sharing_settings | Admin updates sharing settings
admin:upsert_workspace_notice | Admin creates or updates workspace notice
admin:upsert_workspace_guidance | Admin creates or updates workspace guidance
admin:upsert_workspace_logo | Admin creates or updates workspace logo
admin:delete_workspace_logo | Admin deletes workspace logo
admin:update_workspace_brand_name | Admin updates workspace brand name
admin:edit_user_profile | Admin edits user profile
admin:update_playbook_permissions | Admin updates playbook permissions
admin:publish_playbook | Admin publishes playbook
admin:unpublish_playbook | Admin unpublishes playbook

User Operations

Audit Log Type | Description
user:create_query | User creates query
user:client_view_history | User views history in client
user:fetch_history | User fetches history
user:client_view_history_item | User views specific history item in client
user:fetch_history_item | User fetches specific history item
user:update_history_item | User updates history item
user:delete_history_item | User deletes history item
user:cancel_history_item | User cancels history item
user:fetch_client_matters | User fetches client matters
user:add_client_matters | User adds client matters
user:delete_client_matters | User deletes client matters
user:update_client_matters | User updates client matters
user:fetch_vault_top_level_folders | User fetches vault top-level folders
user:fetch_vault_example_project | User fetches vault example project
user:set_vault_example_project | User sets vault example project
user:unset_vault_example_project | User unsets vault example project
user:fetch_vault_folder_path | User fetches vault folder path
user:fetch_project_metadata | User fetches project metadata
user:fetch_vault_folder | User fetches vault folder
user:fetch_vault_file | User fetches vault file
user:fetch_vault_file_review_rows | User fetches vault file review rows
user:fetch_vault_files | User fetches vault files
user:fetch_vault_folders_by_path | User fetches vault folders by path
user:create_vault_folder | User creates vault folder
user:create_vault_review_query | User creates vault review query
user:upload_vault_files | User uploads vault files
user:update_vault_file_metadata | User updates vault file metadata
user:update_vault_folder_metadata | User updates vault folder metadata
user:delete_vault_files | User deletes vault files
user:delete_vault_folder | User deletes vault folder
user:fetch_query_questions | User fetches query questions
user:semantic_search_with_vault_folder | User performs semantic search with vault folder
user:retry_vault_files | User retries vault files
user:rerun_vault_review_queries | User reruns vault review queries
user:mark_review_event_completed | User marks review event as completed
user:clear_vault_query_errors | User clears vault query errors
user:fetch_vault_review_query_usage | User fetches vault review query usage
user:fetch_vault_folder_history_stats | User fetches vault folder history stats
user:fetch_vault_projects_history_stats | User fetches vault projects history stats
user:create_vault_folder_sharing_permissions | User creates vault folder sharing permissions
user:update_vault_folder_sharing_permissions | User updates vault folder sharing permissions
user:delete_vault_folder_sharing_permissions | User deletes vault folder sharing permissions
user:create_event_sharing_permissions | User creates event sharing permissions
user:update_event_sharing_permissions | User updates event sharing permissions
user:create_library_item | User creates library item
user:update_library_item | User updates library item
user:delete_library_item | User deletes library item
user:connect_integration | User connects integration
user:disconnect_integration | User disconnects integration
user:fetch_connected_integrations | User fetches connected integrations
user:fetch_integration_token | User fetches integration token
user:export_library | User exports library
user:accept_workspace_notice | User accepts workspace notice
user:enable_workspace_feature | User enables workspace feature
user:disable_workspace_feature | User disables workspace feature
user:view_dms_one_way_sync | User views DMS one-way sync
user:create_dms_one_way_sync | User creates DMS one-way sync
user:trigger_dms_one_way_sync | User triggers DMS one-way sync
user:update_dms_one_way_sync | User updates DMS one-way sync
user:delete_dms_one_way_sync | User deletes DMS one-way sync
user:dms_folder_upload | User uploads folder via DMS
user:bulk_patch_resource_access | User bulk patches resource access
user:revoke_resource_access | User revokes resource access
user:list_resource_access | User lists resource access
user:dms_file_import | User imports file from DMS
user:dms_file_export | User exports file to DMS
user:add_user_profile | User adds user profile
user:edit_user_profile | User edits user profile
user:fetch_vault_history_item | User fetches vault history item
user:review_playbook_document | User reviews playbook document
user:create_playbook | User creates playbook
user:create_user_group | User creates user group
user:add_user_group_members | User adds user group members
user:remove_user_group_members | User removes user group members
user:get_user_group_members | User gets user group members
user:get_user_group | User gets user group
user:list_user_groups | User lists user groups
user:delete_user_group | User deletes user group
user:delete_playbook | User deletes playbook
user:update_playbook | User updates playbook
user:convert_playbook_document | User converts playbook document
user:fetch_playbook_permissions | User fetches playbook permissions
user:duplicate_playbook | User duplicates playbook
user:export_playbook_review | User exports playbook review
user:export_playbook | User exports playbook
user:list_playbooks | User lists playbooks
user:fetch_playbook_history | User fetches playbook history
user:fetch_playbook_version | User fetches playbook version

System Operations

Audit Log Type | Description
system:trigger_dms_one_way_sync | System triggers DMS one-way sync

Use Cases

Use Case 1: Compliance Monitoring and Audit Trail Capture

Challenge: Organizations must maintain detailed records of all user activity for compliance and regulatory requirements. Solution: Regularly fetch and store audit logs using the pagination workflow. Each log entry includes the user, timestamp, IP address, and action type.
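For export, entries can be flattened into a tabular format for a compliance archive. The sketch below assumes hypothetical field names (`id`, `type`, `user`, `ip`, `timestamp`); check them against real entries, since the exact response schema is not shown here:

```python
import csv
import io

def logs_to_csv(logs: list[dict]) -> str:
    """Flatten audit log entries into CSV for an external archive."""
    fields = ["id", "type", "user", "ip", "timestamp"]
    buf = io.StringIO()
    # extrasaction="ignore" skips any fields we don't archive.
    writer = csv.DictWriter(buf, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(logs)
    return buf.getvalue()
```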

Use Case 2: Incident Response and User Investigations

Challenge: When investigating a security incident, teams need to reconstruct what happened during a specific time period. Solution: Use /search?time=<timestamp> to start from a specific point in time, then paginate through subsequent logs to track all activity during the incident window.

Use Case 3: Continuous Monitoring

Challenge: Security teams need to monitor recent activity in near real-time. Solution: Periodically poll /latest to get the most recent log entry, then use /audit?from=<last_seen_id>&take=100 to fetch any new logs since the last check.

Regular Cadence Fetching

For continuous monitoring and compliance requirements, you’ll want to fetch audit logs on a regular schedule. Here’s the recommended approach:

Initial Backfill

If you’re setting up audit log collection for the first time:
import requests
import time

token = "YOUR_API_KEY"
headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {token}"
}

# Start from the earliest log
response = requests.get("https://api.harvey.ai/api/v1/logs/audit/earliest", headers=headers)
current_log = response.json()["log"]

# Save to your database
save_to_database([current_log])
last_processed_id = current_log["id"]

# Paginate through all historical logs
while True:
    time.sleep(1)  # Respect 60 req/min rate limit
    
    response = requests.get(
        f"https://api.harvey.ai/api/v1/logs/audit?from={last_processed_id}&take=1000",
        headers=headers
    )
    
    logs = response.json()
    if not logs:
        break
    
    save_to_database(logs)
    last_processed_id = logs[-1]["id"]
    print(f"Processed {len(logs)} logs, last ID: {last_processed_id}")

# Store checkpoint in database
save_checkpoint(last_processed_id)

Incremental Updates

After your initial backfill, run this on a regular schedule (e.g., every 5-15 minutes):
import requests
import time

token = "YOUR_API_KEY"
headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {token}"
}

# Load the last processed ID from your database
last_processed_id = load_checkpoint()

# Fetch new logs since last check
response = requests.get(
    f"https://api.harvey.ai/api/v1/logs/audit?from={last_processed_id}&take=1000",
    headers=headers
)

new_logs = response.json()
if new_logs:
    save_to_database(new_logs)
    
    # Update the checkpoint
    last_processed_id = new_logs[-1]["id"]
    save_checkpoint(last_processed_id)
    
    print(f"Fetched {len(new_logs)} new logs")
else:
    print("No new logs since last check")

Scheduling Recommendations

  • High-activity workspaces: Poll every 5-10 minutes with take=1000
  • Medium-activity workspaces: Poll every 15-30 minutes with take=500
  • Low-activity workspaces: Poll hourly with take=100

Key Considerations

  • Persistent storage: Always save the last processed log ID to disk/database so your process can resume after restarts.
  • Idempotency: Audit logs are immutable, so it’s safe to reprocess the same log multiple times if needed.
  • Error handling: If a fetch fails, retry from the same log ID and don’t skip ahead.
  • Rate limiting: With the 60 requests-per-minute limit, each batch of up to 1000 logs consumes one request, so allow at least one second between batches. Plan your cadence accordingly.
  • Gap detection: Monitor timestamps to detect if you’re falling behind. If the latest fetched timestamp is more than your polling interval old, increase frequency or batch size.

Best Practices

  • Respect rate limits: The API is limited to 60 requests per minute. Implement appropriate delays in your polling logic
  • Store logs externally: Export audit logs to your own SIEM or audit repository for long-term retention and analysis
  • Handle timestamps correctly: The time parameter uses UTC epoch timestamps (seconds since January 1, 1970)
  • Track pagination state: Always save the last processed log ID to persistent storage to resume pagination if your process is interrupted
  • Monitor for new event types: The API may add new event types over time, so build your parsing logic to handle unknown types gracefully
  • Implement retries with backoff: If you hit rate limits or encounter errors, implement exponential backoff before retrying
  • Deduplicate on ingestion: Use the log id field as a unique identifier to prevent duplicate storage if you need to reprocess logs
  • Set up alerting: Monitor your sync process to ensure logs are being fetched regularly and alert if the process fails
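The retry-with-backoff practice can be sketched as a small wrapper. The request call and sleep function are injected so the logic is testable; in practice `do_request` would be a lambda wrapping `requests.get`. The set of retryable status codes is an assumption, not API-documented behavior:

```python
import time

def fetch_with_backoff(do_request, max_retries=5, base_delay=1.0, sleep=time.sleep):
    """Retry do_request() with exponential backoff on 429 and 5xx responses.

    do_request: callable returning an object with a .status_code attribute
    (e.g. lambda: requests.get(url, headers=headers)).
    """
    for attempt in range(max_retries):
        response = do_request()
        if response.status_code not in (429, 500, 502, 503):
            return response
        # Wait 1s, 2s, 4s, ... before retrying the same request.
        sleep(base_delay * (2 ** attempt))
    raise RuntimeError(f"Giving up after {max_retries} attempts")
```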

Error Handling

Status Code | Description | Example Error Message
200 | Success | N/A
400 | Bad Request – Invalid query input | { "error": "Missing required filters" }
401 | Unauthorized – Invalid API key | { "error": "Unauthorized" }
429 | Too Many Requests – Rate limit exceeded | { "error": "Rate limit exceeded" }
500 | Internal Server Error | { "error": "Unexpected server error" }

Need help getting started? Contact your Harvey Customer Success Manager for more information.