FileMaker Cloud Provider for Apache Airflow
This is a custom provider package for Apache Airflow that enables integration with FileMaker Cloud's OData API.
Features
- FileMakerHook: Handles authentication with FileMaker Cloud through AWS Cognito and provides methods to interact with the OData API.
- FileMakerQueryOperator: Executes OData queries against FileMaker Cloud.
- FileMakerExtractOperator: Extracts data from FileMaker Cloud and saves it in various formats.
- FileMakerSchemaOperator: Retrieves and parses the FileMaker Cloud OData metadata schema.
- FileMakerDataSensor: Sensor that monitors FileMaker tables for specific conditions.
- FileMakerChangeSensor: Sensor that detects changes in FileMaker tables since a timestamp.
- FileMakerCustomSensor: Customizable sensor that allows for complex monitoring logic.
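Conceptually, `FileMakerChangeSensor`'s check boils down to filtering records by a modification timestamp. A minimal, hypothetical sketch of that idea (the `ModifiedAt` field name and ISO-8601 strings are illustrative assumptions, not the provider's actual schema):

```python
from datetime import datetime

def records_changed_since(records, since):
    """Return records whose (assumed) 'ModifiedAt' ISO-8601 timestamp is newer than `since`."""
    cutoff = datetime.fromisoformat(since)
    return [r for r in records if datetime.fromisoformat(r["ModifiedAt"]) > cutoff]

rows = [
    {"id": 1, "ModifiedAt": "2024-01-01T10:00:00+00:00"},
    {"id": 2, "ModifiedAt": "2024-03-01T10:00:00+00:00"},
]
# Only record 2 was modified after the cutoff.
changed = records_changed_since(rows, "2024-02-01T00:00:00+00:00")
```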
Installation
Installation from PyPI (Recommended)
```shell
pip install arktci-airflow-provider-filemaker
```
Provider Structure
This provider follows the official Apache Airflow provider structure:
```
providers/filemaker/
├── pyproject.toml
├── provider.yaml
├── setup.py
├── README.md
├── src/
│   └── airflow/
│       └── providers/filemaker/
│           ├── __init__.py
│           ├── hooks/
│           │   ├── __init__.py
│           │   └── filemaker.py
│           ├── operators/
│           │   ├── __init__.py
│           │   └── filemaker.py
│           ├── sensors/
│           │   ├── __init__.py
│           │   └── filemaker.py
│           └── auth/
│               ├── __init__.py
│               └── cognitoauth.py
└── tests/
    ├── unit/
    │   └── filemaker/
    │       ├── hooks/
    │       │   └── test_filemaker.py
    │       ├── operators/
    │       │   └── test_filemaker.py
    │       ├── sensors/
    │       │   └── test_filemaker.py
    │       └── auth/
    │           └── test_cognitoauth.py
    ├── integration/
    │   └── filemaker/
    │       └── test_integration_filemaker.py
    └── system/
        └── filemaker/
            └── example_filemaker.py
```
Manual Installation
1. Copy the `providers/filemaker` directory to your Airflow project's `providers` directory.

2. Install the required dependencies:

   ```shell
   pip install -r requirements.txt
   ```

3. Install the package:

   ```shell
   pip install -e .
   ```

4. Create a FileMaker connection in Airflow:
   - Connection ID: `filemaker_default` (or any ID you prefer)
   - Connection Type: `filemaker`
   - Host: Your FileMaker Cloud host (e.g., `my-fmcloud.filemaker-cloud.com`)
   - Schema: Your FileMaker database name
   - Login: Your FileMaker Cloud username (Claris ID)
   - Password: Your FileMaker Cloud password
   - Extra: JSON containing Cognito details (if not using auto-discovery):

     ```json
     {
       "user_pool_id": "your-cognito-user-pool-id",
       "client_id": "your-cognito-client-id",
       "region": "your-aws-region"
     }
     ```
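As an alternative to the UI, Airflow can also pick up connections from `AIRFLOW_CONN_<CONN_ID>` environment variables. A minimal sketch of building such a URI (all values below are placeholders, and the Extra field is omitted):

```python
from urllib.parse import quote

# Hypothetical credentials -- substitute your own.
user = "someone@example.com"
password = "s3cret/with:chars"
host = "my-fmcloud.filemaker-cloud.com"
database = "students"

# Airflow resolves connections from AIRFLOW_CONN_<CONN_ID> environment variables,
# so this URI could be exported as AIRFLOW_CONN_FILEMAKER_DEFAULT.
# Reserved characters in the login and password must be percent-encoded.
uri = f"filemaker://{quote(user, safe='')}:{quote(password, safe='')}@{host}/{database}"
```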
Connection Testing
The FileMaker provider supports connection testing in the Airflow UI. To test your connection:
- After configuring your connection as described above, click the Test button.
- The provider will attempt to authenticate with your FileMaker Cloud instance using the provided credentials.
- You'll receive feedback indicating whether the connection was successful or if there were any issues.
The test verifies:
- All required connection parameters are provided (host, database, username, password)
- Authentication with FileMaker Cloud is successful
- A valid token can be retrieved
Common error messages and their solutions:
- Missing FileMaker host/database/username/password: Ensure all required fields are filled in.
- Failed to retrieve authentication token: Verify your credentials are correct.
- Connection failed: Check network connectivity to your FileMaker server.
For more detailed information about setting up connections, refer to the connections documentation.
Development
Setting Up Development Environment
1. Clone the repository:

   ```shell
   git clone https://github.com/your-org/airflow-dev.git
   cd airflow-dev/providers/filemaker
   ```

2. Install development requirements:

   ```shell
   pip install -r requirements-dev.txt
   ```

3. Set up your credentials for integration tests:
   - Copy `.env.template` to `.env`
   - Fill in your FileMaker Cloud credentials

   ```shell
   cp .env.template .env
   # Edit .env with your credentials
   ```
Running Tests
Unit Tests
Unit tests can be run without FileMaker Cloud credentials:

```shell
pytest tests/unit
```
Integration Tests
Integration tests require valid FileMaker Cloud credentials.
Important Note on Authentication: The integration tests use AWS Cognito authentication with the /fmi/odata/login/info endpoint. If your FileMaker Cloud instance uses a different authentication method or if this endpoint is not accessible, the tests will fail with a 400 Bad Request error. You may need to modify the authentication code to match your FileMaker Cloud version's requirements.
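For reference, the login-info URL that the integration tests request can be built by simple concatenation. A minimal sketch (the host below is a placeholder; only the `/fmi/odata/login/info` path comes from this document):

```python
def login_info_url(host: str) -> str:
    """Build the OData login-info URL; `host` must include the https:// scheme."""
    return host.rstrip("/") + "/fmi/odata/login/info"

url = login_info_url("https://my-fmcloud.filemaker-cloud.com/")
```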
To run integration tests:
1. Ensure your `.env` file is properly configured with:
   - FILEMAKER_HOST (must include the https:// protocol)
   - FILEMAKER_DATABASE
   - FILEMAKER_USERNAME
   - FILEMAKER_PASSWORD

2. Run the tests:

   ```shell
   ./run_tests.sh
   ```

   Or run specific tests:

   ```shell
   python -m pytest tests/integration -v
   ```
Continuous Integration
For CI/CD with GitHub Actions, add the following secrets to your repository:
- FILEMAKER_HOST
- FILEMAKER_DATABASE
- FILEMAKER_USERNAME
- FILEMAKER_PASSWORD
This will allow integration tests to run during the release workflow.
Troubleshooting
Integration Test Authentication Failures
If you're seeing authentication errors like 400 Client Error: Bad Request for url: https://your-host/fmi/odata/login/info, it may indicate:
- Your FileMaker Cloud instance is using a different authentication endpoint
- The FileMaker Cloud version has updated its API format
- Your credentials don't have the necessary permissions
Check your FileMaker Cloud documentation for the correct API endpoint format and authentication requirements.
License
This provider is licensed under the same license as Apache Airflow.
Important Note: If you use placeholder values like "your-filemaker-host.com" in your .env file, the integration tests will attempt to run but will fail with connection errors. This is expected behavior since the placeholders are not valid hosts. You need real FileMaker Cloud credentials for the integration tests to pass.
To run only specific tests, you can use pytest directly:
```shell
# Run only unit tests
python -m pytest tests/unit -v

# Run only integration tests
python -m pytest tests/integration -v

# Run a specific test file
python -m pytest tests/unit/filemaker/hooks/test_filemaker.py -v
```
When running integration tests in a CI/CD environment, you'll need to securely provide credentials:

1. In your GitHub repository, go to Settings > Secrets and add the following secrets:
   - `FILEMAKER_HOST` (include the https:// protocol)
   - `FILEMAKER_DATABASE`
   - `FILEMAKER_USERNAME`
   - `FILEMAKER_PASSWORD`

2. These secrets will be automatically used by the GitHub Actions workflow when running integration tests.
Authentication Information
OData Authentication Details
This provider implements authentication using AWS Cognito with the following fixed credentials (same as the JavaScript implementation):
- User Pool ID: `us-west-2_NqkuZcXQY`
- Client ID: `4l9rvl4mv5es1eep1qe97cautn`
The authentication header format used is:

```
Authorization: FMID <token>
```

This matches the implementation in JavaScript examples like:

```javascript
// JavaScript example
const response = await fetch(baseUrl, {
  headers: {
    'Authorization': `FMID ${fmidToken}`,
    'Accept': 'application/json'
  }
});
```
This authentication implementation:
- Uses the standard AWS Cognito `USER_PASSWORD_AUTH` authentication flow
- Retrieves an ID token for your FileMaker credentials
- Sends the ID token with the `FMID` prefix in the Authorization header
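A hedged Python sketch of this flow (the helper names are mine; the client ID, region, and `USER_PASSWORD_AUTH` flow come from this document, and the boto3 import is kept local so the header helper works even without boto3 installed):

```python
def fmid_auth_header(id_token: str) -> dict:
    """FileMaker Cloud's OData API expects the Cognito ID token with an 'FMID' prefix."""
    return {"Authorization": f"FMID {id_token}", "Accept": "application/json"}

def get_id_token(username: str, password: str,
                 client_id: str = "4l9rvl4mv5es1eep1qe97cautn",
                 region: str = "us-west-2") -> str:
    """Sketch of the USER_PASSWORD_AUTH flow via boto3 (requires the boto3 package)."""
    import boto3  # local import: only needed when actually authenticating
    client = boto3.client("cognito-idp", region_name=region)
    resp = client.initiate_auth(
        AuthFlow="USER_PASSWORD_AUTH",
        AuthParameters={"USERNAME": username, "PASSWORD": password},
        ClientId=client_id,
    )
    return resp["AuthenticationResult"]["IdToken"]

# Header construction alone needs no network access:
headers = fmid_auth_header("example-token")
```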
Authentication Troubleshooting
If you're experiencing authentication issues:
- Ensure your FileMaker host URL includes the protocol (`https://`)
- Verify your credentials are correct
- Check network logs to confirm the token is being sent correctly with the `FMID` prefix
The integration tests have been enhanced to provide detailed debugging information to help diagnose authentication issues.
Operators
FileMakerQueryOperator
This operator executes a query against a FileMaker database and returns the results.
```python
from airflow.providers.filemaker.operators.filemaker import FileMakerQueryOperator

query_task = FileMakerQueryOperator(
    task_id="query_filemaker",
    filemaker_conn_id="filemaker_default",
    database="students",
    layout="students",
    query={"Grade": "A"},
)
```
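How the `query` dict maps onto the OData request is an implementation detail of the hook. Purely as an illustration of OData's query syntax (the `$filter` rendering and the `/fmi/odata/v4/<database>/<table>` base path are assumptions here, not guaranteed to match the provider's internals):

```python
from urllib.parse import urlencode

def odata_filter(query: dict) -> str:
    """Render a field->value dict as an OData $filter expression (string equality only)."""
    return " and ".join(f"{field} eq '{value}'" for field, value in query.items())

# {"Grade": "A"} becomes "$filter=Grade eq 'A'" once URL-encoded.
params = urlencode({"$filter": odata_filter({"Grade": "A"})})
url = "https://my-fmcloud.filemaker-cloud.com/fmi/odata/v4/students/students?" + params
```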
FileMakerExtractOperator
This operator extracts data from a FileMaker database and saves it to a file.
```python
from airflow.providers.filemaker.operators.filemaker import FileMakerExtractOperator

extract_task = FileMakerExtractOperator(
    task_id="extract_filemaker",
    filemaker_conn_id="filemaker_default",
    database="students",
    layout="students",
    query={"Grade": "A"},
    destination_path="/tmp/students.json",
)
```
FileMakerSchemaOperator
This operator retrieves the schema for a FileMaker layout.
```python
from airflow.providers.filemaker.operators.filemaker import FileMakerSchemaOperator

schema_task = FileMakerSchemaOperator(
    task_id="get_schema",
    filemaker_conn_id="filemaker_default",
    database="students",
    layout="students",
)
```
FileMakerCreateRecordOperator
This operator creates a new record in a FileMaker database.
```python
from airflow.providers.filemaker.operators.filemaker import FileMakerCreateRecordOperator

create_task = FileMakerCreateRecordOperator(
    task_id="create_record",
    filemaker_conn_id="filemaker_default",
    database="students",
    layout="students",
    record_data={
        "FirstName": "John",
        "LastName": "Doe",
        "Email": "john.doe@example.com",
        "Grade": "A",
    },
)
```
FileMakerUpdateRecordOperator
This operator updates an existing record in a FileMaker database.
```python
from airflow.providers.filemaker.operators.filemaker import FileMakerUpdateRecordOperator

update_task = FileMakerUpdateRecordOperator(
    task_id="update_record",
    filemaker_conn_id="filemaker_default",
    database="students",
    layout="students",
    record_id="123",  # Can also use XCom: "{{ ti.xcom_pull(task_ids='create_record')['recordId'] }}"
    record_data={
        "Grade": "A+",
        "Notes": "Updated via Airflow",
    },
)
```
FileMakerDeleteRecordOperator
This operator deletes a record from a FileMaker database.
```python
from airflow.providers.filemaker.operators.filemaker import FileMakerDeleteRecordOperator

delete_task = FileMakerDeleteRecordOperator(
    task_id="delete_record",
    filemaker_conn_id="filemaker_default",
    database="students",
    layout="students",
    record_id="123",  # Can also use XCom: "{{ ti.xcom_pull(task_ids='create_record')['recordId'] }}"
)
```
FileMakerBulkCreateOperator
This operator creates multiple records in a FileMaker database in a single request.
```python
from airflow.providers.filemaker.operators.filemaker import FileMakerBulkCreateOperator

bulk_create_task = FileMakerBulkCreateOperator(
    task_id="bulk_create",
    filemaker_conn_id="filemaker_default",
    database="students",
    layout="students",
    records_data=[
        {
            "FirstName": "Jane",
            "LastName": "Smith",
            "Email": "jane.smith@example.com",
            "Grade": "B+",
        },
        {
            "FirstName": "Bob",
            "LastName": "Johnson",
            "Email": "bob.johnson@example.com",
            "Grade": "C",
        },
    ],
)
```
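If a source system yields more rows than you want to send in a single request, the `records_data` payload can be split into smaller batches before handing them to one or more bulk-create tasks. A minimal, generic sketch (the batch size is arbitrary):

```python
def chunked(records, size):
    """Split a record list into batches of at most `size` records each."""
    return [records[i:i + size] for i in range(0, len(records), size)]

# Five records in batches of two -> batch sizes 2, 2, 1.
batches = chunked([{"n": i} for i in range(5)], 2)
```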
FileMakerExecuteFunctionOperator
This operator executes a FileMaker script/function.
```python
from airflow.providers.filemaker.operators.filemaker import FileMakerExecuteFunctionOperator

execute_script_task = FileMakerExecuteFunctionOperator(
    task_id="execute_script",
    filemaker_conn_id="filemaker_default",
    database="students",
    layout="students",
    script_name="UpdateGrades",
    script_params={
        "gradeThreshold": "C",
        "newGrade": "C+",
    },
)
```
FileMakerToS3Operator
This operator extracts data from a FileMaker database and uploads it to an S3 bucket.
```python
from airflow.providers.filemaker.operators.filemaker import FileMakerToS3Operator

to_s3_task = FileMakerToS3Operator(
    task_id="filemaker_to_s3",
    filemaker_conn_id="filemaker_default",
    aws_conn_id="aws_default",
    database="students",
    layout="students",
    query={"Grade": "A"},
    s3_bucket="my-bucket",
    s3_key="data/students.json",
)
```
Example DAGs
The provider package includes example DAGs that demonstrate the usage of the FileMaker operators:
- `example_filemaker_query.py` - Shows how to query data from FileMaker
- `example_filemaker_to_s3.py` - Shows how to extract data from FileMaker and upload it to S3
- `example_filemaker_record_management.py` - Shows how to create, update, and delete records in FileMaker
File details
Details for the file `arktci_airflow_provider_filemaker-2.4.1.tar.gz`.

File metadata
- Download URL: arktci_airflow_provider_filemaker-2.4.1.tar.gz
- Size: 39.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.11

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `a5ea63c8e7399b9710d039c1b5a1a4ae425c2c6b590b6ec4663997622ec6dbfa` |
| MD5 | `54536281c70e2fb8b3e218db5372cd52` |
| BLAKE2b-256 | `144275a0f835ddca577a612ae2e0ad1aa4e971c487eb84d4db32bc82601c286a` |
File details
Details for the file `arktci_airflow_provider_filemaker-2.4.1-py3-none-any.whl`.

File metadata
- Download URL: arktci_airflow_provider_filemaker-2.4.1-py3-none-any.whl
- Size: 40.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.11

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `c4e9d080cb2dbbe3dca011108500b7f526c53a8e02ec078b60e648990039af23` |
| MD5 | `f536d38838de5b7a14476b5860aa1b04` |
| BLAKE2b-256 | `977799dcff426ecef5d0d0af2e101d9442abc164ef19ffef9b86e9ea27347317` |