MightyData

FileMaker and WordPress Consultants

Dealing with Duplicate Records

October 17, 2013 by Anders Monsen

A common issue in all databases is dealing with duplicate records. In FileMaker I tend to use two different methods to identify duplicate records, and each serves its own purpose. One method uses a self-join relationship with a calculation that marks the duplicate records, while the other uses a script that sorts and loops through the records.

On the surface, both methods are fairly simple; once memorized, they will come to your aid time and time again. If you maintain a database template or a technique file, you can set up both methods there and refer to them as needed.

Self-Join

The first method, identifying duplicates via a self-join, is drawn directly from FileMaker’s Help. It is a technique that has existed for several years, simple and elegant. This method requires two additional fields in your table (or three, if you do not yet have a field with your “unique criteria”). Once you determine what makes your record unique (which may require a calculation; the example uses name + phone number), you create a self-join relationship. You do this by creating a new table occurrence. I usually call this “MyTableName__Self” (with MyTableName switched to the actual table with duplicate records, of course). Set the relationship to match UniqueField = UniqueField.
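
If your table needs a dedicated unique-criteria field, a simple stored calculation will do. The following is a minimal sketch, assuming hypothetical NameFirst, NameLast, and Phone fields; adjust it to whatever combination makes a record unique in your table.

// UniqueField (calculation, text result, stored)
// Concatenating name and phone gives the self-join something to match on
MyTable::NameFirst & " " & MyTable::NameLast & " " & MyTable::Phone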

With the self-join in place, create your two new fields, the Counter and the DuplicateFlag (or whatever you want to call them). Set aside the Counter field for now. Make your DuplicateFlag field a calculation. This is a simple If statement:

If( MyTable::Counter = MyTable__Self::Counter ; "Unique" ; "Duplicate" )

Make the Counter field an auto-enter serial number. Then exit the Manage Database window and, on your layout, click into the Counter field. With all records showing, populate the existing records by choosing Records >> Replace Field Contents from the menu and selecting Replace with Serial Numbers. Every record gets a counter value, and the DuplicateFlag updates immediately. You can then search for any duplicates. Records update live, so to find new duplicates just run the search again.

Looping

The second method loops through the found set and checks each record against the previous one. There are more steps involved, but the benefit is that you control exactly what you want to check. This is useful for reports or for displaying certain data to a user by finding or omitting records as needed. Unlike the self-join, this method also requires no schema changes.

Looping script to flag duplicates

Your script will sort the records based on the unique identifier, in this case a phone number. You set a variable to the phone number, then move to the next record in a loop. For each record, you compare the variable to the current record’s phone number. If they match, you set your flag and/or omit the record. If they don’t match, you update the variable and go to the next record. The loop continues until all records in the found set have been checked. In this example, I showed only the duplicate record(s) to make sure the flag works. Usually you would keep the duplicates omitted.
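
Since the screenshot of the script isn’t reproduced here, this is a minimal sketch of the idea, assuming a hypothetical Contacts table with a Phone field as the unique identifier and a DuplicateFlag text field:

# Sort the found set by the unique field, then compare each record to the previous one
Sort Records [Restore; No dialog]
Go to Record/Request/Page [First]
Set Variable [$phone; Value: Contacts::Phone]
Loop
  Go to Record/Request/Page [Next; Exit after last]
  If [Contacts::Phone = $phone]
    # Match found: flag it (or use Omit Record to drop it from the found set)
    Set Field [Contacts::DuplicateFlag; "Duplicate"]
  Else
    # New value: remember it and keep going
    Set Variable [$phone; Value: Contacts::Phone]
  End If
End Loop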

Conclusion

Both these methods are quick and simple techniques. There is nothing fancy involved, although the concept of “self-join” may sound obscure to beginners. Either will work to clean up duplicate data in a flash.

Filed Under: Scripting Tagged With: Data conversion

The GoZync 4 Migration Experience

September 26, 2013 by Anders Monsen

I have developed a few FileMaker Go solutions which use SeedCode’s GoZync tool to sync mobile and server data. One mobile solution required a dozen tables, which GoZync 3 handled fairly well, but took a few minutes to sync. When SeedCode released GoZync 4, they boasted a significant speed increase. Naturally, that made this particular solution a prime candidate to update. Aside from speed changes, GoZync 4 also eliminated dedicated syncing layouts on the mobile and hosted files, moving these to the GoZyncMobile file instead. This made for cleaner customer files. However, migrating a solution built around the architecture of v3 to a brand new architecture seemed daunting.

The SeedCode crew informed me that migration was not that complicated, and that an hour or less might be all that was needed to migrate from v3 to v4. While that might be the case for smaller mobile files, reconfiguring my mobile file for a dozen or so tables took a little longer. Overall, the process went smoothly, as advertised. Rather than a step-by-step walkthrough of the migration, which SeedCode has already created, here are some thoughts on the process.

Having a backup of all files is crucial. Don’t just back up the mobile file; back up your hosted file as well (or mothership files, as SeedCode calls them). Once you’ve completed your migration, you can clean out your old syncing layouts in both files. With backups you can revert to good copies if anything goes wrong.

Migration is almost like a brand new integration. With GoZync you deal with at least five (5) files:

  • GoZyncHosted (GZH)
  • GoZyncLicence
  • GoZyncMobile (GZM)
  • Your hosted file
  • Your mobile file

However, all connections to the mobile and hosted files are made in GoZyncMobile, and integration then takes place in GoZyncHosted as before. In GZM, setting up the connections in the Relationships Graph is simple. With the GoZync TO in the middle, you add hosted files on one side and mobile files on the other. (See Diagram 1.)

Diagram 1: Relationships to mobile and hosted files

Next, you create your syncing layouts. Follow SeedCode’s format and group your syncing layouts in mobile and hosted folders. (See Diagram 2.) Then you can bring up multiple windows side by side and check that the fields are present for syncing. No field, no data sync. Caveat: calculation fields are tricky; I didn’t get any visual feedback about a calculation field on the layout, only an error when syncing. This might be due to the new ability to sync a calculation field to a non-calculation field, which is a nice bonus.

Diagram 2: Layout folders for mobile and hosted layouts

Once your mobile file is set up, the next item is to configure your hosted file. The interface has changed since v3. You still sync each table, but the process makes it easy to get an overview of all tables being synced. You also can set certain tables to either push, pull, or both. (See Diagram 3.) Some of my tables were for reference only, and so I ended up pulling down half the tables, and both pushing and pulling the other half.

Diagram 3: Interface to sync each table

Then, it’s just a matter of replacing the scripts in your mobile file with the new scripts from the sample mobile file, hooking up the buttons, and syncing to test. I found that GoZync 4 is significantly faster than v3.

Filed Under: Scripting Tagged With: Data conversion, FileMaker Go, GoZync

Exporting Cross Platform vCards From FileMaker

September 19, 2013 by Darren Burgess

vCard is a standard file format for distributing contact information from applications such as Mac’s Address Book and Microsoft Outlook. These applications have the ability to export contact records into a vCard file, or import an existing vCard. We see them all the time attached to emails.

Naturally, as FileMaker developers we may consider how we can leverage this data-sharing tool in our FileMaker applications. If you want to create your own vCard export, you can head over to Wikipedia, check out the vCard 3.0 specification, and use it to build the calculations and scripts that format your contact data into the required structure. Store that in a field, export the field contents, and you are done. You can even use FileMaker to pop it into a new email. Easy.
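
To make that concrete, here is a minimal sketch of such a calculation, covering only a few of the vCard 3.0 properties and assuming hypothetical NameFirst, NameLast, and Phone fields (as the rest of this post explains, the line endings still need attention before Windows will accept the file):

// vCardData (calculation, text result)
"BEGIN:VCARD" & ¶ &
"VERSION:3.0" & ¶ &
"N:" & Contacts::NameLast & ";" & Contacts::NameFirst & ";;;" & ¶ &
"FN:" & Contacts::NameFirst & " " & Contacts::NameLast & ¶ &
"TEL;TYPE=WORK,VOICE:" & Contacts::Phone & ¶ &
"END:VCARD"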

One Step Forward

Now, any self-respecting experienced developer is also lazy, and in the process of solving this problem I first searched to see what else had been done. Gotta get it done fast, right? So a quick search found a simple vCard export solution, by Chris Pye of MacTec. There is a sample file there as well.

You would think I was just about done now, with the only steps remaining being to integrate Chris’ scripts into my customer’s solution. (Thanks for sharing, Chris!) Unfortunately there is one ginormous caveat: the resulting vCard, when generated from a Mac, is not compatible with Windows. And my customer works entirely on Macs but requires that vCards be cross-platform. What happens when you try to import a Mac-generated vCard into Windows Outlook is… nothing. Blank fields.

It so happens you can actually open the vCard file (it is just text) in WordPad and re-save it, and it will then open in Outlook; however, I can’t really ask my customer to do that. That WordPad trick did provide me a clue, however.

File Encoding Is Crucial

It turns out that FileMaker’s Export Field Contents script step defaults to UTF-8 character encoding (hey, someone fact check that for me). The exact format is not really the point, other than that I suspected the encoding might be why it was not working in Windows. Next I thought I would check out the free BaseElements plugin (thanks, Nick!) to see if it had a function that would do what I needed. Luck was on my side, as there are two functions that will help: BE_SetTextEncoding and BE_WriteTextToFile.

After the Export vCard script deals with constructing the vCard data in the variable $vCardData, here is what the script steps look like:

# Set the FilePath
Set Variable [$FilePath; Value:Get ( DesktopPath ) & "vCard.vcf"]
# Set text encoding to ANSI
Set Variable [$var; Value:BE_SetTextEncoding ( "ANSI_X3.4-1986" )]
# Create a file using the data stored in the local variable.  
# This step replaces the usual Export Field Contents script step.
Set Variable [$var; Value:BE_WriteTextToFile ( $FilePath ; $vCardData )]

Just When I Thought It Was Done

But we are not done yet. The file generated will not open in Microsoft Outlook. Bummer. Not one to give up, I tried opening the vCard in Windows Notepad and found that there were no carriage returns, just a string of data mashed together. Clearly, Windows did not like the FileMaker “¶” character that was used to build the vCard data.

Not all paragraph returns are the same: check out the table of Unicode characters documented in FileMaker’s Code() function. Code(10) is a line feed. Code(13) is a carriage return (remember typewriters?!? A carriage return is an artifact from that ancient invention). The next step, and the one that finally solved the issue, was to substitute all of the FileMaker “¶” characters with line feed characters using the Char() function. Something like this:

Set Variable [$vCardData; Value:Substitute( $vCardData; "¶"; Char (10) )]

And now the vCard, generated from a Mac resident FileMaker database, is automagically cross-platform. Maybe next time we will work on vCard importing.  See you then!

Filed Under: Scripting Tagged With: Integration, Plugin, vCard

Using a Web Service to Generate Barcodes

September 4, 2013 by Anders Monsen

Recently I came across a web service over at ProgrammableWeb that generates barcodes. This might be useful if you scan barcodes and want an alternate method to create barcodes in your FileMaker solution.

The web service, Barcodes4.me, is a RESTful service and requires no API key or fee. It supports a handful of barcode types, as well as QR codes, and requires only a simple URL call. Using FileMaker 12’s Insert from URL script step, you can send a query to this web service and insert a barcode image into a container field.
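
As a rough illustration (not from the demo file itself), the script can be as short as two steps. The URL pattern below is an assumption based on the service’s REST style, so check the Barcodes4.me documentation for the exact path, and the field names are hypothetical:

# Build the request URL from the value to encode (here as a Code 39 barcode)
Set Variable [$url; Value: "http://www.barcodes4.me/barcode/c39/" & Items::SKU & ".png"]
# Request the image and place it directly into a container field
Insert from URL [Select; No dialog; Items::BarcodeImage; $url]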

Given that this is a free service, there is no guarantee how long it will remain available or free. The types of barcodes available also are limited. Still, this is a good option to keep in mind when developing and testing a mobile solution with barcodes. I have included a small demo file to show this in action.

WebServicesDEMO.fmp12

Filed Under: Barcodes Tagged With: Demo file, FileMaker Go, Insert From URL, Web service

Creating Your First iPad Application: Layout Design

August 9, 2013 by Darren Burgess

I am deep in the middle of developing an iPad FileMaker Go application for an entertainment labor management company. I thought this would be a great time to share some of what I have learned about designing a mobile database solution.

Layout Design

In designing my iPad layouts I wanted the user to never have to scroll a form view layout. In other words, the layout would fit perfectly both in landscape and portrait mode on the device. This was accomplished by setting the width of the layout to fit the device width while in portrait mode and the height of the layout to fit the device height while in landscape mode.

Layout design for portrait and landscape

So, I set the right margin to 768 and the total layout height (header + body + footer) to 686. In the case of this application, I am hiding and locking the status toolbar that shows at the bottom of the iPad screen, giving me a bit more room for screen display. (To force the Inspector to display the layout width and right margin location, click anywhere on blank layout space.)

In hindsight, I would not have designed this layout with a header and footer. They are not really serving a purpose with respect to functionality or design, other than that I can apply gradient fill independently in those layout parts.

Tab Panels

My next goal was to provide additional data display and functionality while the user was in landscape mode. I originally learned about this technique at the 2012 DevCon. To accomplish this, I placed the additional elements in the right margin, with the left edge of these elements at 768. The width of these elements is 256, bringing the total width to 1024, a perfect fit in landscape. The effect of this on the iPad display is that the elements in the margin will hide in portrait and display in landscape. Here are a few things to keep in mind.

  • Field in Right Margin – A field in the right margin will display in landscape mode, but cannot be interacted with unless it is touching the right margin. You can’t use the Go To Object or Go To Field script steps to get there either; it generates an error.
  • Grouped Objects – If objects are grouped, and the grouped object is touching the right margin, only the objects in the group that are touching the margin will display. See the screen shot below. The upper right icons (search, create, view) were originally a grouping of native FileMaker objects. What I found was that only the rectangle object would display in landscape; the icons would not appear. The simple solution to this was to take a screen shot of grouped objects and use the resulting PNG graphic file instead.
    Tab panels for iPad landscape mode
  • Tab Panels – The right margin area is made up of nested tab panels, allowing for additional data display and functionality. The main tab panel has 3 tabs, with the width of each tab set to 86. This puts the tab panels behind the icons in the graphic. I couldn’t use buttons here – remember they will not display unless they are touching the right margin. So I used a tab panel instead to simulate the buttons and provide the user with a visual indication (the small triangles) to indicate where they are. I also provided a second tab panel in the middle main tab panel that has 3 tabs, each providing a different kind of related record creation user interface.

And here is another hint to consider when overlaying objects on a tab panel, if you don’t want the object “stuck” to the panel: place the object first, then slide the panel to the object. This results in the object being independent of the panel. Don’t move that object, though, without first moving the tab panel to the side. Move the object, then put the tab panel back. If you move the object first, it will get stuck to the panel.

So there you have it. With a bit of patience and trickery, you can create FileMaker Go mobile database applications that mimic some of the user interface behaviors that iOS users have come to expect. Stay tuned for Part 2 where I continue with some of my findings related to iPad field behavior, mobile database syncing, and more.

Filed Under: Layout Design Tagged With: FileMaker Go, Layout objects

Using SQL to Duplicate a Record Set

July 26, 2013 by Anders Monsen

Duplicating a single record in FileMaker is simple. However, what if you are faced with duplicating not just a record, but all related child records? I wanted to accomplish this without modifying the graph, and without looping through records to build my data sets.

For my customer’s development project, I had a table of tickets with associated time, material, and other related records, where sometimes only minor changes to a big ticket were needed. Rather than create the related records from scratch each time, we needed a fast and simple method to duplicate key elements from each related table while retaining the relationship to the newly created ticket, or parent, record.

I chose to use FileMaker Pro 12’s native SQL to build arrays with field/value pairs. Unlike a language like PHP, which has built-in logic for handling arrays, FileMaker requires these to be constructed using text-character separators, so they are not iron-clad arrays. However, once I constructed the, um, pseudo-arrays, I could loop through the array values and set my new records.

Although it’s quite possible to set up an interface where you list all fields in related tables that need to be duplicated, I chose to set my field names within the SQL statement. Any changes to fields then need to be made inside the SQL statement, although with a little more programming my solution could conceivably adopt the field storage option. For the sake of this example, I elected to keep the syntax in the SQL statement fairly simple (no custom functions, although the GetFieldName function accounts for any Table Occurrence name changes). If field names change or you want to add more fields, the SQL needs to be updated. Also, with regard to your own fields, you need to be aware of any reserved words that will cause the SQL to fail.

Download example file Tickets.fmp12.

The Related Table Query

To begin with, I set a variable with the original ticket number so I could message the user after the script finished. Then I used SQL to select the fields for each related table, and set my array. The example from the time table shows this format. I concatenated the field name and the value from the field. For number, date, and time fields I found that I needed to use the double pipe concatenation symbol instead of the plus sign; using the plus sign for these non-text field types caused some strange behaviors, usually stripping out the value.

In order to later parse out the values from my pseudo-array I added “^” as the field separator. I picked this character because, unlike a comma, I didn’t anticipate it appearing in any field. You may notice the row separator uses a tilde (~) instead of a pilcrow or return symbol, which is the default separator. The reason for this is explained below in the script parameter section for the subscript that creates the new records.

Let([
query = "
SELECT
'" & GetFieldName ( TIME_ENTRY::NameFirst ) & "|'+namefirst,
'" & GetFieldName ( TIME_ENTRY::NameLast ) & "|'+namelast,
'" & GetFieldName ( TIME_ENTRY::TimeBegin ) & "|'||timebegin,
'" & GetFieldName ( TIME_ENTRY::TimeEnd ) & "|'||timeend
FROM time_entry
WHERE id_ticket = ?";
result = ExecuteSQL( query ; "^" ; "~" ; TICKET::ID )
];
result
)

With my result sets in hand, I duplicated the ticket with the “Duplicate Record/Request” script step. Since I used Get ( UUID ) for my primary table ID, I had to uncheck the “Do not replace existing value of field (if any)” check box; otherwise the UUID would also get duplicated.

Alternate Duplicate Record Option

If you do not want to use the “Duplicate Record/Request” script step, you can perform the same action with New Record/Request. First, you need another SQL statement that grabs only select items from the record to be duplicated.

Let([
query = "
SELECT
'" & GetFieldName ( TICKET::TicketDescription ) & "|'+ticketdescription,
'" & GetFieldName ( TICKET::TicketName ) & "|'+ticketname,
'" & GetFieldName ( TICKET::ID_Customer ) & "|'+id_customer
FROM ticket
WHERE id = ?";

result = ExecuteSQL( query ; "¶" ; "" ; TICKET::ID )
];
result
)

To set the fields, you loop through the SQL result; each value holds a field name and its data separated by a pipe (or another separator that can later be swapped). With the field/value pair in hand, I could use the “Set Field by Name” script step.

# Create the new ticket record, then set its fields from the SQL result
New Record/Request
Set Variable [$itemCount; Value: ValueCount( $ticketResult )]
Loop
  Exit Loop If [Let ( $i = $i + 1 ; If ( $i > $itemCount ; Let ( $i = "" ; True ) ) ) ]
  Set Variable [$row; Value: GetValue( $ticketResult ; $i )]
  Set Variable [$field; Value: GetValue( Substitute( $row ; "|" ; "¶" ) ; 1 ) ]
  Set Variable [$value; Value: GetValue( Substitute( $row ; "|" ; "¶" ) ; 2 )]
  Set Field By Name [$field; $value]
End Loop
Commit Records/Requests []

After the Parent Record Is Duplicated

Next, I set variables with my new ID and ticket number. The new ID would be used in the subsequent related records. I called a subscript for each related table, passing in the ticket ID, the layout of the related table, and the SQL result for that related table’s values, which is spelled out in the example file. In the subscript I used a return-delimited set of values for the script parameters. To address the point raised above about the tilde separator: if I had used the default return delimiter between rows in ExecuteSQL, my script parameter method would have stripped out everything but the first row or record. To prevent this I used the tilde, and then in the subscript added back the pilcrow/return character as the row separator. To make it easier to validate the values in the Data Viewer while testing the scripts, I split this into two steps.

Set Variable [$SQL_Lines; Value: GetValue( Get( ScriptParameter ); 3)]
Set Variable [$SQL_Result; Value: Substitute( $SQL_Lines ; "~" ; "¶" )]

If you use a different method of passing multiple parameters, you just adapt the SQL and subscript to handle this. My goal with the example file is to keep it simple with no custom functions.

The subscript, “Duplicate_RelatedTable”, goes to the relevant layout, then uses a nested loop to create the records and set the values. Because the ticket’s time entry portal might contain multiple rows, I needed to create a new record for each one and set the relevant values from the original parent record. Because I used the same foreign key naming convention, I could use Set Field by Name for the Ticket ID in each related table.

Go to Layout [$Layout]
Set Variable [$itemCount; Value: ValueCount( $SQL_Result )]
Loop
  Exit Loop If [Let ( $i = $i + 1 ; If ( $i > $itemCount ; Let ( $i = "" ; True ) ) ) ]
  Set Variable [$row; Value: GetValue( $SQL_Result ; $i )]
  New Record/Request
  Set Variable [$ForeignID; Value: Get ( LayoutTableName ) & "::ID_Ticket" ]
  Set Field By Name [$ForeignID; $TicketID // from script parameter ]
  Set Variable [$record; Value: Substitute( $row ; "^" ; "¶" ) ]
  Loop
    Exit Loop If [Let ( $j = $j + 1 ; If ( $j > ValueCount( $record ) ; Let ( $j = "" ; True ) ) )]
    Set Variable [$data; Value: GetValue( $record ; $j ) ]
    Set Variable [$field; Value: GetValue( Substitute( $data ; "|" ; "¶" ) ; 1 ) ]
    Set Variable [$value; Value: GetValue( Substitute( $data ; "|" ; "¶" ) ; 2 )]
    Set Field By Name [$field; $value]
  End Loop
  Commit Records/Requests []
End Loop

The counters are handled within the Exit Loop conditions. Since there are multiple loops, I wanted to make sure each counter variable was cleared once its section of the loop completed.

With all this in place, the user only needs to click the “duplicate record” button, and the script runs. Once done, they see a message showing which ticket number was duplicated and the new ticket number. Success!

Filed Under: Calculations Tagged With: Data conversion, Demo file, ExecuteSQL, FileMaker 12

Server-Side Scripting Improves Performance by 99%

June 20, 2013 by Darren Burgess

Target

MightyData developed an order management system for Lettermen Sports several years ago.  Lettermen Sports is one of the premier sports apparel production companies in Minnesota with locations in Blaine and Plymouth and a large production facility where it completes nearly 300 orders per week for sports apparel screen printing, lettering and shipping.

The primary purpose of its FileMaker application is to manage the flow of orders through its production facility. A core facet of the solution is its interface with the SQL-based point-of-sale program, CounterPoint. The FileMaker solution was developed by MightyData to import orders and order detail from CounterPoint.

Performance of the import/update process began to lag behind the increasingly fast pace of the business as the record counts in some of the FileMaker tables grew to over 300,000. With the update process taking an average of 23 minutes, the in-house FileMaker server performance was being compromised by the load. Employee productivity was reduced by having to wait for orders to be imported, and by the performance hit to the FileMaker server.

Lettermen Sports’ management approached MightyData to re-design the order import process to accomplish the following:

  • Reduce the time required to import and update orders from CounterPoint to the FileMaker order management system
  • Automate the import process
  • Eliminate the up-to-24-hour delay for new and updated CounterPoint orders to be created/updated in the FileMaker database

Technology

MightyData’s technical team arrived at a solution to redesign the import/update process to more closely match the current needs of the business. The original import/update process involved importing data from the point-of-sale solution using FileMaker’s import script step to update matching records and create new records that were unmatched. Import subscripts were then used to import data from across the CounterPoint solution so that data required for shop floor production was available in the FileMaker database. This approach was very effective at ensuring that all data was successfully imported as the solution would import across all orders and line item detail each time the update was run.

As the business rules evolved, however, the update process became more complex. Combined with high record counts, a new approach was required.

The new design was based on only working with new and changed records in the CounterPoint system. In so doing, the number of order records that the scripts were required to process was reduced dramatically, to just 3 or 4 at a time. Conveniently enough, CounterPoint provides an order creation and update timestamp which is updated any time an order or its line-item detail is changed. This allowed for a complete overhaul of the scripting process that simultaneously accomplished all three of the customer’s goals. Furthermore, all script execution was moved to the server, allowing staff to focus on order production and not on managing the update process.
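
The production scripts themselves aren’t shown in this case study, but the core idea, processing only records stamped after the last successful run, can be sketched in a few server-side script steps. Everything below (table, field, and subscript names) is hypothetical and glosses over the actual import details:

# Scheduled on FileMaker Server every 30 minutes
Go to Layout ["Settings"]
Set Variable [$lastRun; Value: Settings::LastImportTimestamp]
Set Variable [$thisRun; Value: Get ( CurrentHostTimestamp )]
# Find only CounterPoint orders created or updated since the last run
Go to Layout ["CounterPoint Orders"]
Enter Find Mode []
Set Field [CounterPoint_Orders::LastUpdated; ">=" & $lastRun]
Perform Find []
# The found set is now just the handful of new or changed orders;
# a subscript creates or updates the matching FileMaker records
Perform Script ["Update Orders From Found Set"]
Go to Layout ["Settings"]
Set Field [Settings::LastImportTimestamp; $thisRun]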

Transformation

This project is an excellent example of how value is created when a customer’s need meets great technical planning and execution, in a spirit of partnership to get the job done well. The new solution reduced the update execution from an average of 23.4 minutes to just 2.5 seconds – a 99.8% performance gain. While technically impressive, the real value comes from production staff getting new orders and updates no longer than 30 minutes after they are recorded in the point-of-sale system. If even that is too long, the update can be run manually in a few seconds, giving the team immediate access to new order data.

As an added bonus, MightyData also gets to work with some really good people at Lettermen Sports. They help us help them. And that partnership makes all the difference.

Filed Under: Scripting Tagged With: Case study, Performance, Server-side scripting

Hooking Formstack into FileMaker

June 13, 2013 by Anders Monsen

There are generally three ways to get web-based form data into FileMaker.

  1. You can set up Instant Web Publishing using FileMaker Server Advanced, exposing a FileMaker layout directly to the web.
  2. You can write a custom HTML form and submit the data using FileMaker’s PHP API.
  3. You can use a third-party interface that works with FileMaker’s PHP API.

Using IWP seems like it requires little or no effort. However, aside from the additional expense of Server Advanced, certain programming constraints still exist. Accounting for multiple layouts or conditional fields that appear based on answers can complicate your solution quickly. A custom form requires a basic knowledge of HTML and associated web technologies (CSS, JavaScript, PHP, etc.), but again runs into issues if you want AJAX-like feedback and actions.

Using a third-party option like Formstack gives you the flexibility of custom HTML technologies and offloads much of the design work. Still, design is only half the picture, as you need to integrate something like Formstack into FileMaker.

Getting Started With Formstack

The starting page for any developer is Formstack’s Developer Central page. You will need a Formstack account, and there is a cost associated with this method, though you can try it for free first. For our purposes we’re going to use the Webhooks method instead of the REST API. The webhooks method sends data in key/value format to an external URL upon the form submit action. That external URL is your PHP page, where you write the code to parse the submitted data and push it into FileMaker.

Form builder screen in Formstack

Your FileMaker file is where data will end up after the form is submitted. It needs to be hosted and set up for fmphp access. The web interface layout(s) need only the fields that will receive data from the form. This gives you a couple of options: you can use the data separation model and then hook into this database to display your data or process it for analytics, or you can use the same database and build your analytics process as part of the solution.

Setting Up the Webhook

Once you have built your form, open its Settings and decide where to point the form for each submission. Note the two actions in the “After the Form is Submitted” section. This is where you point to the page that processes the data, and where you redirect the user after the data is processed.

Redirect settings for webhook in Formstack

To process the form data, we need FileMaker’s PHP API files, Formstack’s own class (Formstack.php) and an API key from Formstack. The Formstack items are found in Formstack’s Dashboard, in the Forms tab, in the Social Media section.

You need to write at least one file, with an optional second file. The first processes the data behind the scenes; in these examples I have simply called it “webhook.php”. I created a second page, to which the form redirects after the data is processed. This is a simple “Thank you” page. You can customize the Thank You page to display any text you like, and include a link back to the survey.

The Webhook Code

The webhook.php file contains the PHP code that handles the data.

/* FORMSTACK SECTION */
require_once('Formstack.php');
define('API_KEY' , 'YOUR_FORMSTACK_API_KEY_GOES_HERE' );

// variable to skip extract function
$skipExtract = $_GET['skipExtract'];

$formstack = new Formstack(API_KEY); //instantiate the class

$form = $formstack->form($_POST['FormID']); //internal form ID
$form_name = $form['name']; // capture name

$data = array('form_name' => $form_name );
foreach( $form['fields'] as $field ) {
  $id = $field['id'];
  $name = $field['name'];
  $value = $_POST["$id"];

  if($skipExtract) {
   $extract = $value;
  } else {
   $extract = fs_extract($value);
  }

  // check for sub values
  if($value == $extract) {
   $data["$id"] = "$value";
  } else {
   foreach( $extract as $subkey => $subvalue) {
    $data["$subkey"] = "$subvalue";
   }
  }
}

/* FILEMAKER SECTION */
require_once('FileMaker.php');
$fm = new FileMaker('MySurveys','hostaddress','user','pwd');
$action = $fm->newAddCommand('web_layout');

// now set the fields
$action->setField('FirstField', $data['fieldid']); // see below for the number
// etc for additional fields

$result = $action->execute();

Note that in this case, for the setField action, you need to know the field ID issued by Formstack. To find the field ID, go to Publish >> Advanced, copy the contents of the “Form HTML” field into a text editor, and find the field(s) being submitted. There you will see something like id=”field1234567″ name=”field1234567″ associated with each field. You need only the number. Return to your webhook.php page and fill in the field ID associated with each field you want to map to the FileMaker database. With Formstack you also can set up notifications, view stats, and more.

Job Application Form

For an example of how the front end works in action, check out our job application form. The form is created in Formstack and displayed on the web, and the submissions are then imported into a FileMaker database, where we parse the data and keep notes for the future.

While there might be slightly more effort in setting this up than building your own form, Formstack allows for an easy customization of forms and does most of that work for you. The required code to hook your forms into FileMaker is fairly minimal, yet the rewards save time and effort in the long run.

Filed Under: Web Development Tagged With: Formstack, Integration, PHP, Web service, Webhook

Propa-Gator: Modular Record Creation

May 21, 2013 by Darren Burgess

A few weeks back I posted my first contribution to Todd Geist’s new collaborative effort to create a library of shared FileMaker modules. This script was originally inspired by the need to create reusable code for my customer projects at MightyData. My goal with the Propa-Gator module was to increase my coding efficiency by creating a portable script that would allow the creation of records and setting of any number of fields in the new record with any desired value.

In addition, I wanted the script to be able to use global field inputs as the field values in the new record and to nullify those globals on completion. Finally, I wanted to ensure that the script was completely context free and immune to field renaming.

The result was a script that I found to be very useful in a variety of contexts:

  1. Related records can be created such as log records, phone numbers and attributes.
  2. Flat records can be normalized. For example, if you have a table with two fields representing phone number 1 and phone number 2, you can loop through the found set of records and call the CreateRecord script twice, creating related phone number records in a phoneNumber table. I found this to be extremely useful when importing Excel spreadsheets supplied by customers.
  3. Create record from global fields. Below is a screen shot from the examples provided with the Propa-Gator file.  In this example, the user can enter values in two global fields. The second field uses an OnObjectExit trigger to launch the CreateRecord script with the globals used as data inputs. The script has the smarts to identify that the inputs were from global fields and will nullify those globals.

Creating records from global fields

Scripted Find Requests

A common FileMaker technique is to build find requests using something like the following sequence of script steps:

Enter Find Mode []
Set Field [someTable::DateField1 ; ">=01/01/2010"]
New Record/Request
Set Field [someTable::StatusField1 ; "Complete"]

The CreateRecord script can be used in this case to create the find requests, as in example 3 that is included with Propa-Gator.

Enter Find Mode []
# Creates and sets values for first find request
Perform Script [CreateRecord] 
# Creates and set values for second find request
Perform Script [CreateRecord] 
# Delete the first request (it is blank)
Go to Record/Request/Page [First]
Delete Record/Request [No dialog]

Implementing CreateRecord Script

Implementing the CreateRecord script in your solutions is as easy as copying four scripts, three of which are for script trigger suppression and are not required. After you copy the scripts, take a look at the examples to explore ways of implementing the CreateRecord script in your solution. The most difficult part of this process will be properly implementing the script parameter that is passed. While the parameter syntax is clearly documented in the comments of the CreateRecord script, here is the example parameter with a bit of explanation. The parameter, when parsed in the script, creates three local variables: $CR_LayoutName, $CR_TargetFields, and $CR_Data.

"$CR_LayoutName = " & Quote ("DEV_Child")  & ";¶" & 
"$CR_TargetFields= " & 
Quote (
  List ( 
    GetFieldName ( Child::id_Parent ) ;
    GetFieldName ( Child::ChildData1 ) ;
    GetFieldName ( Child::ChildData2 )
  )
) & ";¶" & 

"$CR_Data =  " & 
Quote (
  List ( 
    Parent::id ;
    "SomeData" ;
    Get (CurrentDate) 
  )
)

The first line of the parameter establishes the layout name. When the script parses the script parameter, it will create a local variable $CR_LayoutName and set it to the value specified. The script will navigate to this layout and create the new record in the context of that layout.

"$CR_LayoutName = " & Quote("DEV_Child")  & ";¶" &

The next variable in the parameter, $CR_TargetFields, is a list of the fields in the new record that will be set by the script. Note that the list is wrapped in quotes by the Quote() function and each field name is wrapped by the GetFieldName () function. Use of GetFieldName () is critical to ensuring that changes to field names do not break the script parameter. In this example, the script will create the record and set the three fields in $CR_TargetFields list to the values specified in $CR_Data.

"$CR_TargetFields= " & 
Quote ( 
  List ( 
    GetFieldName ( Child::id_Parent ) ;
    GetFieldName ( Child::ChildData1 ) ;
    GetFieldName ( Child::ChildData2 )
  )
)

The final variable in the parameter, $CR_Data, is a list of data values. In this case, the three fields will each be set to the data value that occupies the same place in the list. Child::id_Parent will be set to the value of Parent::id, Child::ChildData1 will be set to “SomeData” and Child::ChildData2 will be set to today’s date:

"$CR_Data =  " & 
Quote ( 
  List ( 
    Parent::id ;
    "SomeData" ;
    Get (CurrentDate)
  )
)

There are a few tricks to keep in mind regarding the $CR_Data variable. First, the data values can be any field accessible from the original context of the script (in this case the Parent record), or any literal or calculated value. If the source of the data value is a global input field as in the screen shot above, then the global field name should be wrapped with GetFieldName (). Doing so tells the script that source data came from a global field, and that the global field should be nullified as the script executes. Here is what that should look like in the parameter:

"$CR_Data =  " & 
Quote ( 
  List ( 
    Parent::id ;
    GetFieldName ( Globals::GlobalInput1 ) ;
    GetFieldName ( Globals::GlobalInput2 )
  )
)

The script can also handle data field values that have paragraph returns. For example, you may want to create a record and insert a list of values in a field. In this case, you need to substitute the sequence of characters “~CR~” to represent the paragraph return. The script then substitutes a real paragraph return when it encounters the string in a data value. In this example the third field will be set to a return-delimited list containing the values “FirstLine” and “SecondLine”:

"$CR_Data =  " & 
Quote ( 
  List ( 
    Parent::id ;
    "SomeData in single line" ;
    "FirstLine" & "~CR~" & "SecondLine"  )
  )
)

I should also mention that the script traps for a few errors. While most potential errors can be addressed by correctly passing the parameter of the script, I thought it would be prudent to trap for certain errors and return the error in the Exit Script[] step. In case of any trapped error, the script will exit and the new record will not be created. Errors from the following script steps are trapped:

  • Go to Layout[]: Generally caused by passing an invalid layout name, or perhaps security privileges that don’t allow access to the specified layout. (This is the only part of the CreateRecord script that is not immune to renaming).
  • New Record step:  An error may occur here when security privileges prevent new record creation.
  • Set Field By Name: Again, an error here could be caused by security privileges or perhaps a field name that does not exist in the context of the created record.

I hope you enjoy using the CreateRecord script in your solutions. Since the work over at www.modularfilemaker.org is open source by nature, I invite you to pick it apart and make suggestions for improving the script in the comments below. Or submit a revised file to darren_burgess@mightydata.com.

The latest version of Propa-Gator.fmp12 is available at: www.modularfilemaker.org

Filed Under: Rapid Development Tagged With: Demo file, Development standards, FileMaker module

Dynamic Compound Finds With FileMaker and PHP

May 10, 2013 by Anders Monsen

The default behavior for FileMaker’s PHP API find requests is “AND.” Enter multiple criteria in a find request, and FileMaker treats it as “find records where A = ‘this’ AND B = ‘that’.” But what happens if you need to search using OR? The PHP API documentation states that you can override the default AND behavior by setting the logical operator:

$find->setLogicalOperator( FILEMAKER_FIND_OR );

The Problem

However, each time I’ve tried this, the OR action didn’t seem to work the way I wanted. To make it more interesting, I didn’t know how many requirements would make up my OR search, as the criteria could change with each request. There might be two conditions, or six, or more. All criteria were stored in an array; I needed to loop through this array and add my criteria, but when I ran it through the loop, only the last option was remembered, not each option in the array.

foreach( $itemList as $theItem ) {
   $find->addFindCriterion( 'field', $theItem );
}

The documentation for FileMaker’s custom web publishing points to another option, the compound find. This takes separate find requests and stitches them together into one find. However, every example of the compound find showed static criteria, including FileMaker’s own documentation. I needed my criteria to be more fluid.

Page 35 of the documentation shows that a compound find goes somewhat like this (shortened here to its core elements):

// Create the Compound Find command object 
$compoundFind = $fm->newCompoundFindCommand('Form View');

// Create first find request
$findreq1 = $fm->newFindRequest('Form View');

// Create second find request
$findreq2 = $fm->newFindRequest('Form View');

// Specify search criterion for first find request
$findreq1->addFindCriterion('Quantity in Stock', '<100');

// Specify search criterion for second find request
$findreq2->addFindCriterion('Quantity in Stock', '0');
$findreq2->setOmit(true);

// Add the find requests to the compound find command
$compoundFind->add(1, $findreq1);
$compoundFind->add(2, $findreq2);

// Execute compound find command
$result = $compoundFind->execute();

My circumstances required that I find a set of child records, extract unique parent IDs from these, and then perform a find on the parent records based on these IDs. However, I didn’t have just two IDs, but an unknown number. In some instances the search might look at two or three records. In others it might be six, or ten.

Mission Accomplished?

First I ran an initial search to get the parent IDs from the child table. Given a criterion in the child table, I pulled up a list of all parent IDs and looped through the result to build a single array of parent IDs.

foreach ( $records as $record){
  $idList[] = $record->getfield('id_parent');
}
$idList = array_unique($idList);
$idList = array_values($idList);

I needed a unique list, since there might be duplicated parent IDs in my found set, so I ran $idList through the PHP function array_unique(). I also needed to reset the keys in the array so they would go back to 0, 1, 2, 3, etc. instead of 0, 3, 5, 33, etc., which was accomplished with the PHP function array_values().

Once I had the final and clean list, I performed my search on the parent layout. In the FileMaker documentation example (see above) the “findreq” variable is numbered 1 and 2 (with additional requests taking numbers 3,4,5 and so on). I needed my $findreq_n to grow based on the array count. You can concatenate numbers to PHP variables, but it requires a little extra effort. What this now allowed was a more fluid and less limiting way to build compound finds. When built to match the documentation example, I ended up with two loops to construct the compound find.

// the parent array
$myList = array('id1', 'id2'); // etc

// Create the Compound Find command object
$compoundFind = $fm->newCompoundFindCommand('parent_layout');

$i = 0;
while ($i < count($myList)):
  ${'findreq' . $i} = $fm->newFindRequest('parent_layout');
  ${'findreq' . $i}->addFindCriterion('id', $myList[$i]);
  $i++;
endwhile;

$j = 0;
while ($j < count($myList)):
  $compoundFind->add($j,${'findreq' . $j});
  $j++;
endwhile;

// Execute compound find command
$result = $compoundFind->execute();

The use of such “variable variables” is sometimes discouraged, given potential problems. However, in this case the variables worked, whereas previous attempts to get the logical OR operator to work, either within a foreach loop or without any loop, had failed.

While this also allows for unlimited OR criteria, that isn’t always a good idea for performance reasons. However, if you know the count of the parent array, you can always limit the action based on that count. One choice might be to check whether it exceeds a certain number and then re-think the search options.

Let’s Optimize!

The while loops above work, but they are redundant. Furthermore, what if we want multiple criteria in each find request? A cleaner solution is offered by a foreach loop.

$i = 0;
foreach($myList as $myItem ){
  ${'findreq' . $i} = $fm->newFindRequest('parent_layout');

  // add the criteria to the request
  ${'findreq' . $i}->addFindCriterion('Location', 'Some Condition'); // second criteria
  ${'findreq' . $i}->addFindCriterion('id', $myItem); //original ID

  // Add find requests to compound find command	
  $compoundFind->add($i,${'findreq' . $i});
  $i++;
}

This method removes the need for two while loops that separate the addFindCriterion() and add() methods, reads much better and ends up with cleaner code.

Conclusion

Using compound finds provides an alternative to the setLogicalOperator() method, and even though there are more lines of code, it performs fairly quickly. If you ever run into the need for compound finds, especially ones driven by form-submitted data, hopefully you will find this dynamic method useful.

Filed Under: Web Development Tagged With: PHP

