In the course of my work on a customer project, I discovered that the links to externally stored container fields in a FileMaker Server 12 backup can break if the backup is not restored properly. After trying a number of different scenarios, I have come up with a list of circumstances where container fields remain intact and where they do not.
All examples use a Windows-based FileMaker Server and apply equally to both secure and open external container storage. Understanding these situations will help you properly restore FileMaker 12 files that use external container storage.
If you need some background on FileMaker Server 12 and the new features related to backups and enhanced container fields, here are some great resources to get you started:
- Using Container Fields in FileMaker 12 (log in to FileMaker TechNet to download)
- FileMaker Server 12 Configuration Guide (also found at TechNet)
Circumstances where external container data will break (container fields will show files as missing):
- Uploading FMP12 backup files with the FileMaker Server admin console.
- Opening backup files in FileMaker Pro, i.e., not hosted by a server.
- Opening files in FileMaker Pro that were removed via the FMS admin console.
Circumstances where external container data will remain intact:
- Copying FMP12 backup files, along with the respective container data folder from the RC_Data_FMS folder, to the FileMaker Server data directory, then opening the files with the admin console (see the sketch after this list).
- Copying backup files, as above, to a different FileMaker Server.
- Downloading the files with the FMS admin console, then opening the files in FileMaker Pro (i.e., not hosted). In this case, the databases and remote container file directories are downloaded together, so references to externally stored container fields are preserved.
- Downloading the files with the FMS admin console, then uploading them to another server, or the same server, with the admin console. Again, the act of downloading the files brings the container file directory along with them. Uploading these same files will restore the container data on the FileMaker Server.
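To make the copy scenario concrete, here is a minimal sketch of the first two items in script form. The paths and database name are placeholders, not values from any real installation; substitute your own backup set and FMS data directory.

```python
import shutil
from pathlib import Path

# Placeholder paths; substitute your own backup set and FMS data directory.
backup_dir = Path(r"C:\Program Files\FileMaker\FileMaker Server\Data\Backups\Daily_0001")
live_dir = Path(r"C:\Program Files\FileMaker\FileMaker Server\Data\Databases")
db_name = "Invoices"  # placeholder database name

# Copy the database file into the live Databases folder.
shutil.copy2(backup_dir / f"{db_name}.fmp12", live_dir / f"{db_name}.fmp12")

# Copy the matching container data alongside it, preserving the
# RC_Data_FMS/[database name] structure the server expects.
# (copytree requires that the destination folder not already exist.)
shutil.copytree(backup_dir / "RC_Data_FMS" / db_name,
                live_dir / "RC_Data_FMS" / db_name)
```

With both pieces in place, open the files with the admin console as usual.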
And we’re back to the good old days of modifying permissions for files copied to the FileMaker Server live database folder. While not a problem for a server administrator, it does make things confusing for users who are not used to working with permissions. I really liked it when the Admin Console started taking care of permissions automatically during upload. You would think it would handle external container fields for us automatically; maybe that is coming in a future release.
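For anyone hitting this, here is the sort of thing I run after a manual copy. It is only a sketch: it assumes the local fmsadmin group that the FileMaker Server installer creates and a default Windows install path, so check both on your own server.

```python
import subprocess
from pathlib import Path

# Placeholder path; adjust for your installation.
live_dir = Path(r"C:\Program Files\FileMaker\FileMaker Server\Data\Databases")

# Grant the fmsadmin group full control (F) over the copied database file...
subprocess.run(["icacls", str(live_dir / "Invoices.fmp12"),
                "/grant", "fmsadmin:F"], check=True)

# ...and recursively (/T) over the external container data folder.
subprocess.run(["icacls", str(live_dir / "RC_Data_FMS"),
                "/grant", "fmsadmin:F", "/T"], check=True)
```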
Thanks for stopping by, Taylor. We are hoping for an improvement in that regard as well. It would be better if the admin console upload process recognized and restored the backed-up container data when uploading a backup database to the server.
While this is technically correct, I don’t think it tells the whole story. The listed circumstances where external container data will break can all be prevented by moving the directory with container data up two folder levels.
FileMaker Server looks for container data in this directory:
[database location]/RC_Data_FMS/[database name]/
FileMaker Pro and the Admin Console look for container data in this directory:
[database location]/
So, if you are trying to upload an FMS backup via the admin console, first copy it out of the backup directory (so you don’t mess with the files in your backup), then move everything from:
[database location]/RC_Data_FMS/[database name]/
to here:
[database location]/
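In script form, the shuffle looks something like this. A sketch only: the staging folder and database name are placeholders, and it assumes you have already copied the backup set out of the backup directory.

```python
import shutil
from pathlib import Path

# Placeholder: a working copy of the backup set, copied out of the
# backup directory first so the original backup stays untouched.
work = Path(r"C:\Temp\upload_staging")
db_name = "Invoices"  # placeholder database name

# Move everything from [database location]/RC_Data_FMS/[database name]/
# up to [database location]/, which is where FileMaker Pro and the
# admin console upload actually look for container data.
container_dir = work / "RC_Data_FMS" / db_name
for item in container_dir.iterdir():
    shutil.move(str(item), str(work / item.name))
```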
Thanks for the additional info, Dan, and for making a contribution. I would add: you say to copy the files, and this is the correct action to take, as opposed to moving them. When copying, the OS will properly access the hard-linked backup files. (Hard linking occurs because FMS now creates hard-linked copies of files that have not changed, saving backup time and space.)
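If you are curious whether a given backup file is one of those hard-linked copies, you can inspect its link count. A quick sketch with a placeholder path, assuming a recent Python on NTFS:

```python
import os

# st_nlink > 1 means other directory entries share the same data on disk,
# i.e., FMS hard-linked this backup file to an unchanged earlier copy.
path = r"C:\Program Files\FileMaker\FileMaker Server\Data\Backups\Daily_0001\Invoices.fmp12"
print(os.stat(path).st_nlink)
```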
It seems to me that FileMaker Server (the admin console) has trouble uploading very large files.
Under the hood, the admin console zips the file plus the external data, and if the zip file exceeds 2 GB, it has trouble unzipping it again on the server. This was my experience on a Windows 2008 R2 x64 server; it could be different on OS X, which I haven’t tested yet.
If this turns out to be a correct observation, I see problems arising when restoring backups using the admin console.
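If this holds, a quick size check of the upload set before going through the admin console could save some waiting. A sketch with a placeholder path; the 2 GB figure is my observation, not documented behavior:

```python
from pathlib import Path

staging = Path(r"C:\Temp\upload_staging")  # placeholder working copy
limit = 2 * 1024 ** 3  # ~2 GB, per the observation above

# The admin console zips the database plus external data together,
# so total up everything in the upload set.
total = sum(f.stat().st_size for f in staging.rglob("*") if f.is_file())
print(f"Upload set: {total / 1024 ** 3:.2f} GB (before compression)")
if total > limit:
    print("Over ~2 GB; consider copying the files to the server manually.")
```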
Peter,
That is really excellent additional information, thank you. I suspect that with very large files it may be best to move them to the server manually. Have you tried that?
Hi Darren,
I experienced the problem before reading this excellent blog article. It surely would have helped.
So indeed, after experiencing the unzip error I tried copying the 6 GB of external PDF data to the server without using the admin console, only to discover that the containers did not load anymore. Dan’s remark probably explains why this did not work, although it could have been me not paying enough attention to the exact path required.
I ended up uploading only the file, and then adding the PDFs again through a FileMaker script, uploading them that way. Not really optimal, but surprisingly fast. Since that worked for me, I did not examine any other way. I spent a lot of time watching uploads that day… :-)
Well, I am glad the information was at least belatedly useful to you! Thanks again for contributing your experiences on this article. It benefits everyone.
I just experienced 5 days of FileMaker Container Hell. For 6 months my solution was working perfectly with about 8 external container fields, a few of which were encrypted.
Along came the 13.05 Server patch, and suddenly all my container fields displayed “Missing…”
I didn’t notice this for a few days, and when I did, I couldn’t identify what triggered the problem or how to solve it.
Apparently that Server patch redirected FileMaker’s gaze explicitly to the RC_Data_FMS folder, instead of the root folder, which is where I’d stored my subfolders.
As a short-term solution, I have just reverted to internal storage for now.
I strongly believe FileMaker should use the same protocol for FileMaker Pro Advanced and FileMaker Server; problems like this would be avoided.
I am using RefreshFM to update a remotely hosted file. Is it best to download the old file from the admin console, import the data into the new file on a local machine, and then upload it to the admin console through FileMaker Pro Advanced? Or should I import the data directly from the hosted file? ~ THANKS!
Simone – What are you trying to do? Convert “by reference” containers to “external storage”?
I am importing data from one file to another. One file is live on a hosted server with data, and the new, empty file is on my computer. After I import data into the new file and upload it, I don’t want to lose the secured external links to files.
Simone – “External storage” depends on the physical location of the database. If the “new” database is not hosted on the server, it cannot access files in external storage.
Yes, I understand that external storage cannot be accessed unless the file is on the hosting server. I am importing data from the live version into a newer version and then putting it back on the server. I just want to make sure I will not lose the links to the secured files when I put the new file, with the old data, back on the server. I have an internal import script that imports the data, but I also use RefreshFM to import old data into the new version.
From the descriptions above, it sounded like this would break the links to the external files. I do not want to “download” the database from the admin console, since it appears to download all the external files with it. We use external files because they are very large and would literally take days to download.
I hope this makes more sense.
Thanks!
Simone – Without knowledge of the architecture of your solution, I cannot provide a definitive answer. I would suggest either a) contacting Goya to ask whether RefreshFM supports this scenario, or b) setting up a development server with a backup of the solution for testing.
Thanks for responding 🙂
In my opinion, FileMaker has a horrible bug in this area. If I need to work on a file that has remote containers, I always work on a local copy to make sure things are working right before I make it live. You cannot use the console to replace a file that already exists; you must delete it first. And when you delete it, FileMaker decides you don’t really need the remote data and just moves it to a different location. Now you upload your newly modified file and nothing works, because the data has been moved.
No warning, no question whether I really want the data moved; it breaks an existing solution. A big ol’ fat bug, IMHO. If you only have remote access to a hosted solution, you probably will not have enough permissions to move the data back to where it never should have left.
I cannot imagine why the data ever needs to move without my permission, or without me asking for it to be moved.
Bug, Bug, Bug, Bug, Bug