Is Your FileMaker Solution Ready for AI?
Complete this brief assessment and book your free 15-minute consultation to find out where you stand and what to do next.
1. Full Name (required)
- First Name
- Last Name

2. Email Address (required)

3. Organization (required)

4. Your Role (required)

5. What's the #1 thing you're hoping to figure out from this assessment?
6. Which compliance frameworks apply to your organization? Select all that apply.
- HIPAA
- GDPR
- SOX
- FERPA
- PCI-DSS
- None of these
- Not sure
- Other

7. Where are you with AI in your FileMaker environment? (required)
- Actively using AI features now
- Planning / building -- not live yet
- Exploring the idea
- Not sure / want to learn more

8. Who is formally responsible for AI-related decisions in your FileMaker environment -- things like which AI features get built, what data gets used, and how outputs are reviewed? (required)
- No one is formally responsible -- developers or staff make their own calls
- One person handles it informally, but there's no defined role or process
- A designated role or small team reviews AI implementations before they go live
- A cross-functional group with executive involvement reviews all AI use cases, with documented policies and regular check-ins
- I'm not sure

9. When you implement AI in your FileMaker environment, who do you envision being responsible for those decisions -- what gets built, what data gets used, how outputs get reviewed? (required)
- No one is formally responsible -- developers or staff make their own calls
- One person handles it informally, but there's no defined role or process
- A designated role or small team reviews AI implementations before they go live
- A cross-functional group with executive involvement reviews all AI use cases, with documented policies and regular check-ins
- I'm not sure
10. Tell us a bit more — even 'nobody, it's ad hoc' is a useful answer.
11. Does your organization have a written policy for how AI tools -- including FileMaker's AI script steps -- can be used with company data? (required)
- No policy exists
- There are informal verbal guidelines, but nothing documented
- A written policy exists but isn't consistently communicated or enforced
- A documented, enforced policy covers all AI integrations and gets reviewed at least once a year
- I'm not sure

12. Has your organization started drafting policies for how AI tools and FileMaker's AI capabilities could be used with company data? (required)
- No policy exists
- There are informal verbal guidelines, but nothing documented
- A written policy exists but isn't consistently communicated or enforced
- A documented, enforced policy covers all AI integrations and gets reviewed at least once a year
- I'm not sure

13. How long have you been using FileMaker? (required)
- Less than 1 year
- 1-3 years
- 4-10 years
- More than 10 years

14. How does your organization decide whether a new AI use case is appropriate for your FileMaker environment? (required)
- Individual developers or users decide on their own
- The FileMaker developer decides -- there's no formal process
- A standard intake process evaluates use cases against defined criteria
- A cross-functional review evaluates risk, compliance, business value, and data sensitivity before any AI feature goes live
- I'm not sure

15. How are you planning to evaluate whether a new AI use case is appropriate before building it into FileMaker? (required)
- Individual developers or users decide on their own
- The FileMaker developer decides -- there's no formal process
- A standard intake process evaluates use cases against defined criteria
- A cross-functional review evaluates risk, compliance, business value, and data sensitivity before any AI feature goes live
- I'm not sure
16. Can you describe how AI decisions have been made so far, even if there's no formal process?
17. Is sensitive data -- things like protected health information, financial records, or personally identifiable information -- clearly identified and classified in your FileMaker solution? (required)
- No -- we haven't thought through data classification
- Our developer generally knows which tables have sensitive data, but it's not documented
- Sensitive data fields are documented and access is restricted by privilege sets
- All sensitive data is classified, encrypted, access-controlled, and flagged to prevent it from being included in prompts sent to external AI providers
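The strongest option above describes flagging classified fields so they never reach an external prompt. As a rough sketch of that pattern -- illustrative Python, not FileMaker syntax; the field names and the SENSITIVE_FIELDS classification map are hypothetical -- a filter can strip flagged fields before any prompt is assembled:

```python
# Illustrative sketch only: keep classified fields out of AI prompts.
# The field names and classifications below are hypothetical examples.

SENSITIVE_FIELDS = {
    "patients::SSN": "PII",
    "patients::Diagnosis": "PHI",
    "invoices::CardNumber": "PCI",
}

def redact_for_prompt(record: dict) -> dict:
    """Return a copy of the record with classified fields removed."""
    return {
        field: value
        for field, value in record.items()
        if field not in SENSITIVE_FIELDS
    }

record = {
    "patients::Name": "REDACTED-ID-042",
    "patients::SSN": "123-45-6789",
    "patients::Diagnosis": "(omitted)",
    "patients::VisitDate": "2024-06-01",
}

# Only non-classified fields survive for the prompt.
safe = redact_for_prompt(record)
```

In a real FileMaker solution the classification map would live in a table and the filtering in the script that builds the prompt, but the principle is the same: classification is checked in code, not left to memory.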
18. Before adding AI to your FileMaker solution, have you mapped which of your data is sensitive -- things like health records, financial data, or personally identifiable information? (required)
- No -- we haven't thought through data classification
- Our developer generally knows which tables have sensitive data, but it's not documented
- Sensitive data fields are documented and access is restricted by privilege sets
- All sensitive data is classified, encrypted, access-controlled, and flagged to prevent it from being included in prompts sent to external AI providers

19. How would you describe the quality and consistency of data in your FileMaker databases? (required)
- Data quality is unknown -- we've never assessed it
- We know there are issues (duplicates, missing fields, inconsistent formats) but haven't addressed them systematically
- We've conducted data quality audits and have cleanup processes in place
- Automated validation rules enforce data quality at entry, with regular audits and documented data standards
- I'm not sure

20. Do you know what data actually gets sent to your AI provider when your FileMaker AI script steps run? (required)
- No -- we haven't looked at this
- We know some data goes to the provider's API, but we haven't mapped the specifics
- We've documented which script steps send data externally and what types of data are included
- Data flows are fully mapped, contractually verified with our provider, and we have controls preventing sensitive data from going out without authorization
- I'm not sure

21. Have you thought through what data would leave your FileMaker environment and go to an external AI provider when AI script steps run? (required)
- No -- we haven't looked at this
- We know some data goes to the provider's API, but we haven't mapped the specifics
- We've documented which script steps send data externally and what types of data are included
- Data flows are fully mapped, contractually verified with our provider, and we have controls preventing sensitive data from going out without authorization
- I'm not sure
22. What's your best guess about what gets sent — even if you're not sure?
23. Have you assessed whether your AI features in FileMaker create new compliance risks under the frameworks that apply to your organization -- HIPAA, GDPR, SOX, or others? (required)
- We haven't considered this angle
- We're aware there could be risks but haven't formally looked at them
- We've reviewed our main compliance obligations against our AI usage in an informal way
- We've done a formal risk assessment covering all applicable regulations, with documented findings and a plan to address gaps
- I'm not sure

24. Have you considered what compliance risks AI integration might introduce to your FileMaker environment, given the regulatory frameworks that apply to your organization? (required)
- We haven't considered this angle
- We're aware there could be risks but haven't formally looked at them
- We've reviewed our main compliance obligations against our AI usage in an informal way
- We've done a formal risk assessment covering all applicable regulations, with documented findings and a plan to address gaps
- I'm not sure

25. If an AI feature in your FileMaker solution produced incorrect output for a week without anyone catching it -- what would happen? (required)
- We don't have a plan for that scenario -- we haven't thought through it
- Someone would probably notice and fix it, but there's no defined process
- We have informal review steps for AI output in workflows that matter
- We have documented incident response procedures, rollback capability, and escalation paths specifically for AI errors
26. When Generate Response from Model runs in your FileMaker solution, what happens before that output is used -- written to a record, displayed to a user, or acted on? (required)
Note: Generate Response from Model is FileMaker's script step for sending prompts to an AI provider. Its output can be written directly to records -- human review doesn't happen automatically.
- Output goes directly to records or users with no review step designed in
- Users can see the output but aren't specifically trained to verify it before using it
- A deliberate review step requires someone to approve AI output before it's saved or acted on
- Approval gates, override mechanisms, and user training are built into every AI-assisted workflow, with logging of all human decisions
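The review-gate options above can be sketched as a simple approval workflow: AI output lands in a staging area, and nothing reaches the record until someone explicitly approves it. This is illustrative Python, not FileMaker script -- the class names and the approve-and-commit step are hypothetical stand-ins for a staging field plus an approval script in a real solution:

```python
# Illustrative sketch of a human-review gate for AI output. Names are
# hypothetical; in FileMaker this would be a staging field or table plus
# an approval script, not Python classes.

from dataclasses import dataclass

@dataclass
class StagedOutput:
    record_id: int
    ai_text: str
    approved: bool = False

@dataclass
class Record:
    record_id: int
    notes: str = ""

def stage_ai_output(record_id: int, ai_text: str) -> StagedOutput:
    """AI output lands in staging, never directly on the record."""
    return StagedOutput(record_id=record_id, ai_text=ai_text)

def approve_and_commit(staged: StagedOutput, record: Record) -> None:
    """Only an explicit approval writes the AI text to the record."""
    staged.approved = True
    record.notes = staged.ai_text

record = Record(record_id=1)
staged = stage_ai_output(1, "Draft summary from the model")

assert record.notes == ""  # nothing is written before review
approve_and_commit(staged, record)
```

The design point is that the commit path simply does not exist without the approval call, which is what separates the third and fourth options above from the first two.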
27. When you design AI features in FileMaker, how are you planning to handle human review of AI-generated output before it gets written to records or acted on? (required)
Note: Generate Response from Model is FileMaker's script step for sending prompts to an AI provider. Its output can be written directly to records -- human review doesn't happen automatically.
- Output goes directly to records or users with no review step designed in
- Users can see the output but aren't specifically trained to verify it before using it
- A deliberate review step requires someone to approve AI output before it's saved or acted on
- Approval gates, override mechanisms, and user training are built into every AI-assisted workflow, with logging of all human decisions
28. Describe what happens with AI output in your workflows — who sees it and what happens next?
29. Is AI call logging enabled in your FileMaker solutions? (required)
Note: FileMaker's Set AI Call Logging script step captures a record of every AI call your solution makes -- what was sent, what came back, and when. It's off by default.
- Not aware of this / haven't looked into it
- We know it exists but haven't enabled it
- It's enabled in some of our solutions
- It's enabled across all solutions, logs are stored in a dedicated table, and we review and retain them per our compliance requirements
- I'm not sure
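The logging the note above describes boils down to a simple pattern: every AI call records what was sent, what came back, and when. Here is that pattern as an illustrative Python sketch -- call_model is a hypothetical stub, and this is not Set AI Call Logging itself, which handles this inside FileMaker once enabled:

```python
# Illustrative sketch of AI call logging: a wrapper that records prompt,
# response, and timestamp for every call. call_model is a hypothetical
# stand-in for a real provider API.

from datetime import datetime, timezone

AI_CALL_LOG: list[dict] = []  # stands in for a dedicated log table

def call_model(prompt: str) -> str:
    """Hypothetical model call; a real one would hit a provider's API."""
    return f"response to: {prompt}"

def logged_ai_call(prompt: str) -> str:
    response = call_model(prompt)
    AI_CALL_LOG.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "prompt": prompt,
        "response": response,
    })
    return response

logged_ai_call("Summarize open invoices")
logged_ai_call("Draft a follow-up email")
# AI_CALL_LOG now holds one reviewable, retainable entry per call.
```

Whether the log lives in a Python list, a dedicated FileMaker table, or the built-in logging facility, the compliance value is the same: an audit trail that exists before anyone asks for it.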
30. Were you aware that FileMaker has a built-in AI call logging capability? When you implement AI, do you plan to use it? (required)
- Not aware of this / haven't looked into it
- We know it exists but haven't enabled it
- It's enabled in some of our solutions
- It's enabled across all solutions, logs are stored in a dedicated table, and we review and retain them per our compliance requirements
- I'm not sure

31. If you built an AI feature in FileMaker and it produced incorrect output for a week without anyone catching it -- what's your honest concern about the impact? (required)
- We don't have a plan for that scenario -- we haven't thought through it
- Someone would probably notice and fix it, but there's no defined process
- We have informal review steps for AI output in workflows that matter
- We have documented incident response procedures, rollback capability, and escalation paths specifically for AI errors

32. How did you select your AI provider for FileMaker, and what did that evaluation include? (required)
- We're using whatever the developer set up -- usually a default
- We chose based on capability or cost, but didn't look at data handling practices
- We compared providers on capability, cost, and data handling policies
- We ran a formal evaluation covering security certifications, data residency, BAA requirements, SLAs, and exit strategy
- I'm not sure

33. How are you planning to evaluate and select an AI provider for your FileMaker integration? (required)
- We're using whatever the developer set up -- usually a default
- We chose based on capability or cost, but didn't look at data handling practices
- We compared providers on capability, cost, and data handling policies
- We ran a formal evaluation covering security certifications, data residency, BAA requirements, SLAs, and exit strategy
- I'm not sure
34. How do you measure whether your AI implementations in FileMaker are actually delivering value? (required)
- We don't have a way to measure this
- We rely on general feedback that things seem useful
- We track specific metrics -- time saved, error rates, user satisfaction -- for at least some AI features
- Every AI implementation has defined KPIs, is measured against a baseline, and results shape future decisions
- I'm not sure

35. Have you defined what success would look like for AI in your FileMaker environment -- what it would need to accomplish to be worth implementing? (required)
- We don't have a way to measure this
- We rely on general feedback that things seem useful
- We track specific metrics -- time saved, error rates, user satisfaction -- for at least some AI features
- Every AI implementation has defined KPIs, is measured against a baseline, and results shape future decisions
- I'm not sure

36. Does your leadership team understand both what your FileMaker AI features can do and where they fall short? (required)
- Leadership has heard of AI but doesn't have a practical understanding of what it means for our systems
- Leadership is interested but relies entirely on the developer's judgment
- Leadership has been briefed and participates in decisions about where AI gets used
- Leadership actively champions AI initiatives with an informed view of both the opportunities and the risks, and allocates dedicated resources
- I'm not sure

37. How informed is your leadership team about what AI can realistically do -- and not do -- in a FileMaker environment? (required)
- Leadership has heard of AI but doesn't have a practical understanding of what it means for our systems
- Leadership is interested but relies entirely on the developer's judgment
- Leadership has been briefed and participates in decisions about where AI gets used
- Leadership actively champions AI initiatives with an informed view of both the opportunities and the risks, and allocates dedicated resources
- I'm not sure
38. What are your top concerns right now? Select all that apply. (required)
- Sending sensitive data to third-party AI providers
- AI output being wrong without anyone catching it
- Compliance gaps (HIPAA, GDPR, etc.)
- Not knowing where to start
- Leadership buy-in
- Cost and vendor lock-in
- Building something that doesn't actually get used
- Other

39. Is there anything about your FileMaker environment or AI situation that doesn't fit neatly into the questions above?
40. Schedule your Free Call to Review the Gap Analysis Report (required)

41. Source (prefilled from URL parameters)

42. Terms and Conditions (required)
How we handle your data: Your name, email, and organization are stripped from your responses before AI analysis. A Violet Beacon team member reviews the generated report before your call. We don't share your responses or use them to train AI models. Responses are retained for 90 days.