MightyData

FileMaker and WordPress Consultants


Dealing with Duplicate Records

October 17, 2013 by Anders Monsen 2 Comments


A common issue in all databases is dealing with duplicate records. In FileMaker I tend to use two different methods to identify duplicate records, and each serves its own purpose. One method uses a self-join relationship with a calculation that marks the duplicate records; the other uses a script that sorts and loops through records.


On the surface, both methods are fairly simple; once memorized, they will come to your aid time and time again. If you maintain a database template or a technique file, you can set up both methods and refer to them as needed.

Self-Join

The first method, flagging duplicates via a self-join, is drawn directly from FileMaker’s Help manual. It is a technique that has existed for several years: simple and elegant. This method requires two additional fields in your table (or three, if you do not yet have a field with your “unique criteria”). Once you determine which field makes your record unique (which may require a calculation; the example uses name + phone number), you create a self-join relationship. You do this by creating a new table occurrence. I usually call this “MyTableName__Self” (with MyTableName switched to the actual table with duplicate records, of course). Make the relationship key UniqueField = UniqueField.

With the self join in place, create your two new fields, the Counter and the DuplicateFlag (or whatever you want to call them). Set aside the counter field for now. Make your DuplicateFlag field a calculation. This is a simple IF statement:

If( MyTable::Counter = MyTable__Self::Counter ; "Unique" ; "Duplicate" )

Make the Counter field an “Auto-Enter Serial” number. Then exit the Manage Database window and, on your layout, select the Counter field. With all records showing, populate the serials by clicking in the field and choosing Records >> Replace Field Contents from the menu, then selecting “Replace with serial numbers”. All records should populate, and the DuplicateFlag will update immediately. You then can search for any duplicates. Records update live, so to find new duplicates just run the search again.
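If no single field is unique on its own, the “unique criteria” field can be a stored calculation that concatenates fields. A minimal sketch, with field names assumed:

UniqueField = Name & "|" & Phone

The delimiter guards against false matches when different field combinations happen to concatenate to the same string.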

Looping

The second method loops through the found set and checks each record against the previous record. There are more steps involved, but the benefit is that you control what you want to check. This is useful for reports or for displaying certain data to a user by finding or omitting records as needed. Unlike the self-join, this method also requires no schema changes.

Looping script to flag duplicates

Your script will sort the records based on the unique identifier, in this case a phone number. You set a variable to the phone number, then move through the records in a loop. For each record, you compare the variable to the current record’s phone number. If they match, you set your flag and/or omit the record. If they don’t match, you update the variable and go to the next record. The loop continues until all records in the found set are checked. In this example, I showed only the duplicate record(s) to make sure the flag works. Usually you keep the duplicates omitted.
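As a minimal sketch of the steps above (table, field, and flag names are placeholders), the script might look like this:

# Sort so duplicates are adjacent (sorted by Contacts::Phone)
Sort Records [ Restore ; With dialog: Off ]
Go to Record/Request/Page [ First ]
Set Variable [ $phone ; Value: Contacts::Phone ]
Loop
  Go to Record/Request/Page [ Next ; Exit after last ]
  If [ Contacts::Phone = $phone ]
    # Duplicate found: flag it, omit it, or both
    Set Field [ Contacts::DuplicateFlag ; "Duplicate" ]
  Else
    Set Variable [ $phone ; Value: Contacts::Phone ]
  End If
End Loop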

Conclusion

Both these methods are quick and simple techniques. There is nothing fancy involved, although the concept of “self-join” may sound obscure to beginners. Either will work to clean up duplicate data in a flash.

Filed Under: Scripting Tagged With: Data conversion

The GoZync 4 Migration Experience

September 26, 2013 by Anders Monsen Leave a Comment


I have developed a few FileMaker Go solutions which use SeedCode’s GoZync tool to sync mobile and server data. One mobile solution required a dozen tables, which GoZync 3 handled fairly well, but took a few minutes to sync. When SeedCode released GoZync 4, they boasted a significant speed increase. Naturally, that made this particular solution a prime candidate to update. Aside from speed changes, GoZync 4 also eliminated dedicated syncing layouts on the mobile and hosted files, moving these to the GoZyncMobile file instead. This made for cleaner customer files. However, migrating a solution built around the architecture of v3 to a brand new architecture seemed daunting.

The SeedCode crew informed me that migration was not that complicated, and that an hour or less might be all that was needed to migrate from v3 to v4. While that might be the case for smaller mobile files, reconfiguring my mobile file for a dozen or so tables took a little longer. Overall, the process went smoothly, as advertised. Rather than a step-by-step walkthrough of the migration, which SeedCode has already created, here are some thoughts on the process.

Having a backup of all files is crucial. Don’t just back up the mobile file, but also your hosted file (or mothership files, as SeedCode calls them). Once you’ve completed your migration, you can clean out your old syncing layouts in both files. With backups you can revert to good copies if anything goes wrong.

Migration is almost like a brand new integration. With GoZync you deal with at least five files:

  • GoZyncHosted (GZH)
  • GoZyncLicence
  • GoZyncMobile (GZM)
  • Your hosted file
  • Your mobile file

However, all connections to the mobile and hosted files are made in GoZyncMobile, and integration then takes place in GoZyncHosted as before. In GZM, setting up the connections in the Relationships Graph is simple. With the GoZync TO in the middle, you add hosted files on one side and mobile files on the other. (See Diagram 1.)


Diagram 1: Relationships to mobile and hosted files

Next, you create your syncing layouts. Follow SeedCode’s format and group your syncing layouts in mobile and hosted folders. (See Diagram 2.) Then you can bring up multiple windows side by side and check that the fields needed for syncing are present. No field, no data sync. Caveat: calculation fields are tricky; I got no visual feedback about a calculation field on the layout, only when syncing. This might be due to the new ability to sync a calculation field to a non-calculation field, which is a nice bonus.


Diagram 2: Layout folders for mobile and hosted layouts

Once your mobile file is set up, the next item is to configure your hosted file. The interface has changed since v3. You still sync each table, but the process makes it easy to get an overview of all tables being synced. You also can set certain tables to either push, pull, or both. (See Diagram 3.) Some of my tables were for reference only, and so I ended up pulling down half the tables, and both pushing and pulling the other half.


Diagram 3: Interface to sync each table

Then, it’s just a matter of replacing the scripts in your mobile file with the new scripts from the sample mobile file, hooking up the buttons, and syncing to test. I found that GoZync 4 is significantly faster than v3.

Filed Under: Scripting Tagged With: Data conversion, FileMaker Go, GoZync

Exporting Cross Platform vCards From FileMaker

September 19, 2013 by Darren Burgess 6 Comments

vCard is a standard file format for distributing contact information from applications such as Mac’s Address Book and Microsoft Outlook. These applications have the ability to export contact records into a vCard file, or import an existing vCard. We see them all the time attached to emails.

Naturally, as FileMaker developers we may consider how we can leverage this data sharing tool in our FileMaker applications. If you want to create your own vCard export, then you can head over to Wikipedia and check out the vCard 3.0 specification and use that to build your calculations and scripts to format your contact data into the required format. Store that in a field and export the field contents and you are done. You can even use FileMaker to pop it in a new email. Easy.
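For reference, a minimal vCard 3.0 record looks like this (the contact details are made up):

BEGIN:VCARD
VERSION:3.0
N:Doe;John;;;
FN:John Doe
ORG:Example Company
TEL;TYPE=WORK,VOICE:(555) 555-1234
EMAIL:john.doe@example.com
END:VCARD

Build a string like this from your contact fields, and you have your export data.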

One Step Forward

Now, any self-respecting experienced developer is also lazy, and in the process of solving this problem I first searched to see what else had been done. Gotta get it done fast, right? So a quick search found a simple vCard export solution, by Chris Pye of MacTec. There is a sample file there as well.

You would think I was just about done, with the only steps remaining being to integrate Chris’ scripts into my customer’s solution. (Thanks for sharing, Chris!) Unfortunately there is one ginormous caveat: a vCard generated on a Mac is not compatible with Windows. And my customer works entirely on Macs but requires that vCards be cross-platform. What happens when you try to import a Mac-generated vCard into Windows Outlook is… nothing. Blank fields.

It so happens you can open the vCard file (it is just text) in WordPad and re-save it, and it will then open in Outlook. However, I can’t really ask my customer to do that. That WordPad trick did provide me a clue, though.

File Encoding Is Crucial

Turns out that FileMaker’s Export Field Contents script step defaults to UTF-8 character encoding (hey – someone fact check that for me). That encoding is not really the point, other than that I suspected it might be why the export was not working in Windows. Next I thought I would check out the free Base Elements plugin (thanks, Nick!) to see if it had a function that would do what I needed. Luck was on my side, as there are two functions that will help: BE_SetTextEncoding and BE_WriteTextToFile.

After the Export vCard script deals with constructing the vCard data in the variable $vCardData, here is what the script steps look like:

# Set the FilePath
Set Variable [ $FilePath ; Value: Get ( DesktopPath ) & "vCard.vcf" ]
# Set text encoding to ANSI
Set Variable [ $var ; Value: BE_SetTextEncoding ( "ANSI_X3.4-1986" ) ]
# Create a file using the data stored in the local variable.
# This step replaces the usual Export Field Contents script step.
Set Variable [ $var ; Value: BE_WriteTextToFile ( $FilePath ; $vCardData ) ]

Just When I Thought It Was Done

But we are not done. The file generated will not open in Microsoft Outlook. Bummer. Not one to give up, I opened the vCard in Windows Notepad and found that there were no carriage returns, just a string of data mashed together. Clearly, Windows did not like the FileMaker “¶” character that was used to build the vCard data.

Not all paragraph returns are the same – check out the table of Unicode characters documented in FileMaker’s Code() function. Code(10) is a line feed. Code(13) is a carriage return (remember typewriters? A carriage return is an artifact of that ancient invention). The next step, and the one that finally solved the issue, was to substitute all of the FileMaker “¶” characters with line feed characters using the Char() function. Something like this:

Set Variable [$vCardData; Value:Substitute( $vCardData; "¶"; Char (10) )]

And now the vCard, generated from a Mac-resident FileMaker database, is automagically cross-platform. Maybe next time we will work on vCard importing. See you then!

Filed Under: Scripting Tagged With: Integration, Plugin, vCard

Server-Side Scripting Improves Performance by 99%

June 20, 2013 by Darren Burgess 7 Comments

Target

MightyData developed an order management system for Lettermen Sports several years ago.  Lettermen Sports is one of the premier sports apparel production companies in Minnesota with locations in Blaine and Plymouth and a large production facility where it completes nearly 300 orders per week for sports apparel screen printing, lettering and shipping.

The primary purpose of its FileMaker application is to manage the flow of orders through its production facility. A core facet of the solution is its interface with the SQL-based point-of-sale program, CounterPoint. The FileMaker solution was developed by MightyData to import orders and order detail from CounterPoint.

Performance of the import/update process began to lag behind the increasingly fast pace of the business as the record counts in some of the FileMaker tables grew to over 300,000. With the update process taking an average of 23 minutes, the in-house FileMaker server performance was being compromised by the load. Employee productivity was reduced by having to wait for orders to be imported, and by the performance hit to the FileMaker server.

Lettermen Sports’ management approached MightyData to re-design the order import process to accomplish the following:

  • Reduce the time required to import and update orders from CounterPoint to the FileMaker order management system
  • Automate the import process
  • Eliminate the up-to-24-hour delay for new and updated CounterPoint orders to be created/updated in the FileMaker database

Technology

MightyData’s technical team arrived at a solution to redesign the import/update process to more closely match the current needs of the business. The original import/update process involved importing data from the point-of-sale solution using FileMaker’s import script step to update matching records and create new records that were unmatched. Import subscripts were then used to import data from across the CounterPoint solution so that data required for shop floor production was available in the FileMaker database. This approach was very effective at ensuring that all data was successfully imported as the solution would import across all orders and line item detail each time the update was run.

As the business rules evolved, however, the update process became more complex. Combined with high record counts, a new approach was required.

The new design was based on only working with new and changed records in the CounterPoint system. In so doing, the number of order records that the scripts were required to process was reduced dramatically, to just 3 or 4 at a time. Conveniently enough, CounterPoint provides an order creation and update timestamp which is updated any time an order or its line-item detail is changed. This allowed for a complete overhaul of the scripting process that simultaneously accomplished all three of the customer’s goals. Furthermore, all script execution was moved to the server, allowing staff to focus on order production and not on managing the update process.
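In outline, the server-side update becomes a delta sync: only rows stamped after the last run are touched. A sketch of the idea (the table, field, and script names here are hypothetical):

# Scheduled script on FileMaker Server
Set Variable [ $lastRun ; Value: Settings::LastUpdateTimestamp ]
Enter Find Mode [ ]
Set Field [ CounterPoint_Orders::LastMaintTimestamp ; ">" & $lastRun ]
Perform Find [ ]
# Import/update only the found set of new and changed orders
Perform Script [ "Update Orders From Found Set" ]
# Record the new high-water mark for the next run
Set Field [ Settings::LastUpdateTimestamp ; Get ( CurrentHostTimestamp ) ]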

Transformation

This project is an excellent example of how value is driven by pairing a customer’s need with great technical planning and execution, in a spirit of partnership with the customer to get the job done well. The new solution reduced the update execution time from an average of 23.4 minutes to just 2.5 seconds – a 99.8% performance gain. While technically impressive, the real value comes from production staff getting new orders and updates no more than 30 minutes after they are recorded in the point-of-sale system. If even that is too long, the update can be run manually in a few seconds, giving the team immediate access to new order data.

As an added bonus, MightyData also gets to work with some really good people at Lettermen Sports. They help us help them. And that partnership makes all the difference.

Filed Under: Scripting Tagged With: Case study, Performance, Server-side scripting

Virtual List in 3 Easy Steps

April 12, 2013 by Anders Monsen 19 Comments

The Virtual List technique has been around for a few years (created by Bruce Robertson ca. 2009). Over time and with changes to FileMaker, implementation of this technique has evolved. Knowing how to set up and use virtual lists will help with creating certain reports, especially if you draw the values from ExecuteSQL statements in FileMaker Pro 12.

Articles (see Lisette Wilson’s Taming the Virtual List, Part I and Part II) and examples are fairly easy to find. So why write another one? Many examples and articles use specific cases to illustrate the virtual list. I’m going to attempt to boil this down to three simple steps. Once you draft the basics of this process into a FileMaker file, you can re-use it as a template for more complex reports, copying the table and script from one solution to another. This technique requires one dedicated reporting table, plus a script to populate some global variables. The fields in the reporting table are set up as calculation fields that get their values from the list within the global variable.

To illustrate the process, I added a table to the Contacts Start Solution as a sample database, which you can download here.

Step 1: The Reporting Table

When you visualize the report itself, it usually consists of rows and columns. Each row is a record in the table, and each column corresponds to a field. At minimum, you need two fields. One is a hidden id field; it is a crucial component of the report but is not displayed. The second field appears on the report and will contain data generated by the script. The simple two-column example below could have been created as a SubSummary report. However, the power of the virtual list is that it allows you to combine data from many tables, something not easily within reach of a SubSummary report. In my demo file I have three fields: one for the id and two for the actual column data (state and count).

Data from contacts table pulled into the virtual reporting table

Step 2: The Fields

Of the two fields mentioned above, the id field governs what shows up in the data field or column. This field is set up as a number, starting at one, with each subsequent record incrementing by one. Most reports will have fewer than 100 rows, so a general rule of thumb is to create 100 records (if your report demands more rows, just create more records). If you view the report layout as a table, the id field should show 100 rows, with values from 1 through 100.
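A quick one-time script can create those rows. A minimal sketch, with the layout and field names assumed:

Go to Layout [ "Virtual Report" ]
Loop
  New Record/Request
  Set Field [ Report::id ; Get ( RecordNumber ) ]
  Exit Loop If [ Get ( FoundCount ) ≥ 100 ]
End Loop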

The second field is the first column of the report. Reports can have multiple columns, so new fields are created as needed, using the same principle. To keep it generic, in my report template I simply named the fields “column1” and “column2” and set them up as text fields. If you need to format data as numeric values, e.g. $43,000.00, this can be done with text formatting functions. The column field is set up as an unstored calculation: GetValue ( $$COLUMN1 ; id ). This will make more sense after Step 3, the script that creates the $$COLUMN1 global variable. Before the $$COLUMN1 variable exists, the “column1” field has no value. Once the variable is created, the field calculation takes the content of $$COLUMN1, which is a list, and extracts the value that corresponds to the id number for that row. This is the magical part of the virtual list technique.

Assume that $$COLUMN1 contains these values:

  • Banana
  • Apple
  • Orange
  • Pear
  • Grape

This list has five values. For the first five records, as indicated by the values 1 through 5 in the id field, the GetValue() function will pull the corresponding fruit into the “column1” field.
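For example, the record whose id is 3 calculates:

GetValue ( $$COLUMN1 ; 3 )  // "Orange"

Records 6 through 100 get empty values, so those rows simply display as blank.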

Step 3: Filling In The Rows

This is the most flexible part of the process. It can range from complex loops within loops to a simple progression of steps to fill each row and column. First, to test if the reporting table works, you can create a simple looping script:

Set Variable [ $Limit ; Value: 100 ]
Loop
  Exit Loop If [ Let ( $i = $i + 1 ; $i > $Limit ) ]
  Set Variable [ $$COLUMN1 ; Value: If ( IsEmpty ( $$COLUMN1 ) ; $i ; $$COLUMN1 & ¶ & $i ) ]
End Loop
Refresh Window [ Flush cached join results ]

This loop will fill in the first column values with numbers, by virtue of the GetValue function.

To fill in your report with real values, change the “Set Variable” line to something real. In the ContactsVL database, I wanted a count of people living in each state. I created a variable with all the states, and looped through this list. I then ran an SQL query inside the loop for each state.

#Column1
Set Variable [ $$COLUMN1 ; Value: If ( IsEmpty ( $$COLUMN1 ) ; $State ; $$COLUMN1 & ¶ & $State ) ]
#Column2
Set Variable [ $Result ; Value: ExecuteSQL ( "SELECT COUNT(*) FROM Contacts WHERE \"Work State\" = '" & $State & "'" ; "" ; "" ) ]
Set Variable [ $$COLUMN2 ; Value: If ( IsEmpty ( $$COLUMN2 ) ; $Result ; $$COLUMN2 & ¶ & $Result ) ]
#End

In another example, rather than a list you can loop through, the report may have individually specified rows, such as text sub-heads. Here it is just a matter of stepping through each row and setting the $$COLUMNx variable with the result. By adding values to a growing list, you then can parse this list into the rows of the reporting table. For the sub-heads you can use text formatting functions to add bold and increase the font size, making them stand out.
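A sketch of that approach, with the sub-head text and sizes as arbitrary examples:

# Row 1: a bold, larger sub-head
Set Variable [ $$COLUMN1 ; Value: TextSize ( TextStyleAdd ( "Contacts by State" ; Bold ) ; 14 ) ]
# Row 2: a detail row appended to the growing list
Set Variable [ $$COLUMN1 ; Value: $$COLUMN1 & ¶ & "Texas" ]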

Conclusion

Once you have set up a reporting table as in steps one and two, this technique can be used for many different kinds of reports. With a simple starter reporting table and a script or two ready in a template, you can quickly expand this concept to specific solutions.

Filed Under: Scripting Tagged With: Demo file, ExecuteSQL, Reporting, Virtual list

GTRR and the Modal Dialog Window Style

March 20, 2013 by Darren Burgess 16 Comments


Here is a small development gotcha regarding FileMaker 12’s advanced window styles. If you create a modal dialog window and then execute a Go to Related Record (GTRR) that creates a new window from the modal dialog, the new window’s style must also be set to modal dialog.

You might want to do something like this if you present a modal dialog to the user and then need to perform some off-screen processing in a new window after the user performs an action in the dialog. If the new window is not properly set as a modal dialog, the GTRR script step returns error 1 (User Canceled Action) and the GTRR does not occur. That error message does not clearly point to the actual cause, hence the gotcha.
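In script form, the working version looks something like this (the table and layout names are assumed):

# Runs from a window already opened with the Dialog window style
Go to Related Record [ Show only related records ; From table: "Invoices" ; Using layout: "Invoice Processing" ; New window ]
# In the GTRR step's New Window Options, the Window Style must also be
# set to "Dialog"; otherwise the step fails with error 1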

Step-by-Step

Here are a few screen shots to explain. This first image demonstrates the error that occurs when the GTRR new window is not specified as a dialog:

Go to Related Record fails

Now, let’s modify the script and specify a modal dialog for the new window in the GTRR script step:

Go to Related Record in modal window

And when we run the script again the GTRR step can successfully execute:

Go to Related Record succeeds

Documentation

Although the error code is perhaps misleading, this scripting behavior is not undocumented. FileMaker’s online documentation of the advanced window style feature offers an explanation (see the second-to-last bullet point):

Any operations that attempt to open a non-dialog window will fail when a dialog window is open (for example, using a script trigger to run a script from another file). OnTimer scripts do not execute when a dialog window is open unless the script launches from the dialog window itself.

Filed Under: Scripting Tagged With: FileMaker 12

Getting Started With QuickBooks Integration

March 18, 2013 by Anders Monsen 2 Comments


Intuit’s QuickBooks software organizes business information: contacts, vendors, orders, invoices, inventory, and more. Many small businesses use QuickBooks as their back-end accounting software, alongside FileMaker. Often these two applications perform separate functions, but sometimes there may be a need to integrate them. I’m going to briefly discuss one such method, using the FM Books Connector plugin (FMBC) from Productive Computing, Inc.

I’m not going to cover installing the plugin, as the files that are included with the plugin explain how, in detail. However, there are some items to keep in mind:

  • FMBC runs only on Windows.
  • You need to have QuickBooks, install the plugin into the directory FileMaker uses for plugins, and run a couple of installers that allow for XML communication via QuickBooks. Again, PCI has step by step instructions for this process, along with a sample database that will test each required piece.
  • QuickBooks needs to be up and running when you make your queries, edits, and new record requests from FileMaker.
  • Finally, the plugin needs to be registered, or it can run in demo mode for 30 days, though you will run into time-out issues while in demo mode.

On Screen Reference Manual

All requests from FileMaker to QuickBooks are made through function calls from FMBC. These function calls are contained in scripts and, at their most basic, include steps to open the connection, send data, return a result, and close the connection. Exactly what takes place in the “send data” part depends on your action, and any data returned may include a confirmation, content from QuickBooks, or an error message. FMBC comes with a handy developer’s guide, but this is only half the picture. The other half entails understanding Intuit’s On Screen Reference Manual and deciphering the XML tree.

QuickBooks online reference

Each action, such as adding a customer, adding a sales order, or editing a customer, is grouped under a Message Type.

QuickBooks message types

When you select from the list of types, you can view all the requests, responses, and the XML tree. To initiate a request and retrieve a response, you need to know the XML element names (i.e., field or column names) in QuickBooks. For example, adding a customer would require something like this:

QuickBooks XML response

Certain elements are required; others are optional, but all must appear in your script in the same order as in the XML tree. Otherwise, your result will contain an error message. Luckily, FMBC comes with a few sample scripts, and the developer’s guide contains some sample code. As with many plugins, sending the action can be accomplished by setting the value of a global field or a variable with the function call, such as:

Set Variable [ $Result ; Value: PCQB_RqNew( "CustomerAdd" ) ]

Which Message Type?

The key in writing any scripts to query QuickBooks via FMBC is first to outline which message type will be used, and then review the XML for the fields in QuickBooks. From there, map the QuickBooks fields to fields in FileMaker, and use the appropriate function call. So adding a customer, for example, requires the “CustomerAdd” message type and the Name element; Salutation is an optional value that comes after CompanyName. Additional fields and values are then added as needed.

Once you add the various fields you want in your new customer action, you need to deal with the result. You cannot assume the data will be added, so handling errors is critical. Successful calls via the plugin return a zero, so checking for this value or for “!!ERROR!!” text when sending the data will indicate whether something is wrong. If an error is detected, FMBC includes a handy function called PCQB_GetStatus that will show the error message. A few of the errors I’ve run into include values sent in the wrong order, the QuickBooks application being unavailable, and certain pre-conditions or values not being in place.
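Putting the pieces together, an add-customer script might look like the sketch below. PCQB_RqNew and PCQB_GetStatus appear in the text above; the other request functions are assumptions based on the plugin’s Rq* naming convention, so check the developer’s guide for the exact calls:

Set Variable [ $Result ; Value: PCQB_RqNew ( "CustomerAdd" ) ]
Set Variable [ $Result ; Value: PCQB_RqAddFieldWithValue ( "Name" ; Contacts::Name ) ]
Set Variable [ $Result ; Value: PCQB_RqExecute ]
If [ $Result ≠ 0 ]
  # Ask the plugin for the underlying QuickBooks error message
  Show Custom Dialog [ "QuickBooks Error" ; PCQB_GetStatus ]
End If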

Along with adding records, FMBC lets you add related records, query existing data, and edit data already in QuickBooks. Here are a couple of example “gotchas” to keep in mind.

  • Adding sales orders requires that items already exist in QuickBooks’s Item Inventory table.
  • Certain elements like Terms and Ship Via options will need to exist in QuickBooks when sending over sales order data, if you include these values. This isn’t always spelled out, and must be deciphered from error messages.
  • You will want to make sure updates only are attempted from Windows. So if you run a cross-platform user environment, prevent the scripts from running on the Mac OS using the Get( SystemPlatform ) function.

Integrating QuickBooks with your FileMaker solution is quite feasible, but requires careful planning and testing. Once you set up the first few connections and learn how to interpret error messages, the process begins to flow more smoothly. An alternative to running each script in debug mode is to create an error table and log any errors there as they occur. Then you can review the table and pinpoint where the script faltered. Integrating FileMaker and QuickBooks should hopefully reduce data entry (no longer in both places), and give users a friendly front-end to create and update data, as well as to see the data you already have in QuickBooks.

Filed Under: Scripting Tagged With: Integration, Plugin, QuickBooks

Evaluating Script and Layout Performance

December 26, 2012 by Darren Burgess 6 Comments


I recently implemented a customer reporting project using the Virtual List technique. In this particular implementation, the customer needed a cross-tab style project schedule report with 28 columns and about 30 rows. Each “cell” in the report is an unstored calculation field that evaluates a repeating global variable to derive the correct value to display. Creating the report is a combination of scripting to load values into the global variable and reevaluation of the 800+ unstored calculations in the report.

Concern for the performance of the report drove me to explore ways to evaluate it. In particular, I wanted to be able to compare script performance against calculation performance, so that I would know where my efforts would yield the best improvements.

Evaluating Script Performance

Evaluating script performance was a simple matter of setting the start and end times to variables and calculating the difference at the end of the script. At the beginning of the script I use the undocumented function Get ( UTCmSecs ) to get millisecond timestamp precision:

Set Variable [ $startTime ; Value: Get ( UTCmSecs ) ]

At the end of the script I simply calculate the difference between start and end times and divide by 1000 to get a script run-time duration:

Set Variable [ $$TotalScriptTime ; Value: ( Get ( UTCmSecs ) - $startTime ) / 1000 & " Seconds" ]

Note: Make sure the Data Viewer is not visible when evaluating script performance. The display of fields and variables in the Data Viewer will slow script execution.

Evaluating Layout Performance

Of course, script run time is only part of the story. I also wanted to know how long it took for the report to reevaluate all of the 800+ unstored calculations. This would provide excellent insight into which report elements (scripts vs. calcs) had the biggest effect on performance. I was also curious how different calculation methods affected performance (such as ExecuteSQL vs. FileMaker functions based on relationships).

To accomplish this I added three global variables to the layout:

Layout variables to check performance

The start and end time global variable objects have conditional formatting applied. Since conditional formatting evaluates whenever a layout is refreshed, I can use a Let() function in the conditional formatting calculation to create and calculate global variables. I can further take advantage of the stacking order of objects on the layout and cause the start time variable to be calculated first by applying “Arrange/Send to Back” to the start time variable in Layout mode. Here is the calculation used in the conditional formatting for the start time:

Let ( $$reportStartTime = Get ( UTCmSecs ) ; "" )

The end time and total time are created with conditional formatting as well.  In this case, the end time variable’s position is set using “Arrange/Bring to Front.”  Lastly, the position of the total time variable is also moved to the front. Here is the calculation applied in the conditional formatting of the end time:

Let ( [
  $$reportEndTime = Get ( UTCmSecs ) ; 
  $$reportTotalTime = ($$reportEndTime - $$reportStartTime) / 1000 & " Seconds" ] ;
"")

I tuck these globals away into the undisplayed right-hand margin of the layout and slide them out whenever I have the need to evaluate the performance of the report layout.  These global variable objects are easily copied to other layouts or solutions where performance evaluation is needed.

By the way, the report scripting takes about 3.3 seconds to execute, while the report's unstored calculations take only about 0.015 seconds to reevaluate.  Clearly, the scripting is the best target when looking for ways to improve this report's performance.

Filed Under: Scripting Tagged With: Performance, Virtual list

The Power of Insert From URL

July 27, 2012 by Anders Monsen 17 Comments

Insert From URL with ISBN number

When FileMaker Pro 12 launched earlier this year, it seemed that most attention centered on a new layout design surface and the ExecuteSQL function. One example of ExecuteSQL emerged at DevCon 2012: Inspector 4.0, a database analysis tool created in FileMaker, dropped the number of table occurrences from 626 to 135 and the number of relationships from 591 to 81, largely through the use of ExecuteSQL. Yet there is another feature of FileMaker Pro 12 with vast power, at least in terms of integrating with external applications: the “Insert from URL” script step.

In previous versions of FileMaker Pro, interacting with web sites to pull data required a method called web scraping. This usually meant loading a web page in the web viewer, using the function GetLayoutObjectAttribute inside a script to pull the HTML source (after waiting to make sure the page loaded all the way in the web viewer), and then parsing the HTML with text functions.
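For reference, the old approach looked roughly like this (the object and field names are made up for illustration):

Set Web Viewer [ Object Name: "viewer" ; URL: "http://www.example.com/page" ]
Pause/Resume Script [ Duration (seconds): 2 ]
Set Field [ MyTable::HTMLSource ; GetLayoutObjectAttribute ( "viewer" ; "content" ) ]

The fixed pause is the weak point: too short and you scrape a half-loaded page, too long and every lookup crawls.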

For retrieving XML from web services, such as geolocation data from Google or Yahoo, I discovered that the web viewer was not ideal for cross-platform solutions, as it generally failed to return the XML in a Windows environment. In one instance where I needed geolocation information from Google’s KML, I had to resort to importing XML from a URL and parsing it with XSLT, a method that no longer works in FileMaker Pro 12.

With the “Insert from URL” script step, it’s now possible to supply a URL in a script and have the contents load into a field.

Insert From URL with ISBN number

Once you run this script, the web page data loads into the specified field. This data can be HTML, XML, JSON, or an image. This enables FileMaker to interact with a variety of web pages and opens up interaction with RESTful APIs. For example, you could parse Google’s weather API, its host of other APIs, or many other services.

Years ago, I built a library database to track all my books. This database started as a .fp5 file, moved to .fp7, and recently to .fmp12. I rebuilt the FileMaker 12 version from scratch to take advantage of FileMaker Go. When adding new books I don’t want to type everything; with FileMaker Go I can scan the ISBN, and then with Insert from URL I can build a URL string and pull virtually all the data I need from Open Library into my book database. This includes pulling book cover art into container fields. (Stay tuned for a more detailed breakdown of this database, including the ability to scan a barcode and import all book data using the fmp:// protocol.)
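As a rough sketch, the Open Library lookup boils down to two script steps (the field names are hypothetical, and the URL follows Open Library's documented Books API format):

Set Variable [ $url ; Value: "http://openlibrary.org/api/books?bibkeys=ISBN:" & Books::ISBN & "&format=json&jscmd=data" ]
Insert from URL [ Books::JSONResponse ; $url ]

From there it is all text parsing to move the JSON values into their corresponding fields.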

Most APIs that function this way return data as either XML or JSON (although it appears XML is on the way out), and this requires parsing the values out into the corresponding FileMaker fields. The Open Library API returns JSON or JavaScript as the response format. Unfortunately, FileMaker 12 has no native XML or JSON parsing functions (feature request, anyone?), so we still need to use text parsing functions. Andy Knasinski’s ExtractData custom function is ideal for most XML parsing, except that it doesn’t pull values from element attributes, only the values between the tags or the element itself (defined here). Parsing JSON raises other problems, as JSON still doesn’t appear to have one single standard. Makah Encarnacao’s article on parsing JSON didn’t address all the nodes that I needed, so I ended up writing my own custom function to get the title, author, cover image URLs, and so on. Even then, the JSON sometimes varies when covers are absent or when book data contains more verbose information.
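A minimal sketch of that kind of text parsing, extracting the value of a simple string key such as "title" (assuming the response sits in a hypothetical field named Books::JSONResponse):

Let ( [
  key = "\"title\": \"" ;
  start = Position ( Books::JSONResponse ; key ; 1 ; 1 ) + Length ( key ) ;
  end = Position ( Books::JSONResponse ; "\"" ; start ; 1 )
] ;
  If ( start > Length ( key ) ; Middle ( Books::JSONResponse ; start ; end - start ) ; "" )
)

A one-off Let like this breaks down quickly on nested objects or escaped quotes, which is why a dedicated custom function is worth the effort.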

Yet in the end, the key take-away remains the same: the “Insert from URL” script step greatly simplifies previous web scraping methods, and allows for a greater interaction between FileMaker and web services. I look forward to exploring this new tool in greater detail.

Filed Under: Scripting Tagged With: Insert From URL, Integration, Web service

12 Days of FileMaker 12: Window Styles

May 21, 2012 by Susan Fennema Leave a Comment

In this session from the 12 Days of FileMaker 12, we explore enhancements to the New Window script step that provide much more control over new windows and their behavior.

Window Styles in FileMaker Pro

This video demonstrates techniques for using the new “Specify Advanced Styles” option to configure modeless document windows and modal dialog windows. We explore new features and look at a variety of examples and possibilities.

About Chad Adams

Chad Adams, the presenter, is an Information Analyst with Skeleton Key, a Platinum-level FileMaker Business Alliance member in St. Louis, Missouri. He is a FileMaker 7, 8, 10, and 11 Certified Developer and has been working with FileMaker for over 15 years. Over that time, he has run his own business, worked for FileMaker consulting firms, and served as an in-house developer.

Chad has authored articles for FileMaker Advisor Magazine and is a regular contributor to the Skeleton Key YouTube channel. His biggest and most enjoyable job is being Dad to his 11-year-old daughter and 9-year-old son.

Filed Under: Scripting Tagged With: 12 Days of FileMaker 12, FileMaker 12, Video
