JSON-Fortran : A modern tool for modern Fortran developers
Why on earth am I still writing in Fortran?
Despite popular belief, Fortran is alive and well. In the past I've been asked questions like "Who even writes libraries in Fortran?" or (my favorite) "Why aren't you writing this in C?" Though a publication on the age demographics of Fortran developers is still wanting, there is definitely a perception that we're an older group. I'll speak for myself in the hope of emboldening other Fortran developers to stand up for this incredible compiled language that still drives a large portion of scientific computing.
Me? I'm just hitting 30 this year, and Fortran was the first programming language I picked up, through Florida State University's Scientific Computing program, during my undergraduate years in applied mathematics. I've been writing Fortran since 2009 for a variety of applications, my favorite being SELF-Fluids. This code started as a series of homework assignments from a graduate course on Spectral Element Methods with Dr. David Kopriva. SELF-Fluids naturally became a hobby project where I could experiment with modern Fortran features, such as coarrays and derived-type polymorphism, as well as GPU and multi-GPU acceleration with CUDA-Fortran. Some ideas stuck in this project, while many did not.
To remain a modern Fortran developer, rather than becoming legacy, once a year I make it a point to take a look at new standards and tools that have bubbled up and begin experimenting and refactoring all over again. I guess I appreciate impermanence. This year, I'm planning on tackling a number of tasks:
- Swapping out the CUDA-Fortran kernels with AMD's HIP
- Data-structure reorganization to promote code reuse and improve maintainability and extensibility
- Implementing time dependent background and boundary conditions for the compressible fluids solver
- Research and implement high-order method standard output format (enhance current parallel HDF5)
- Enhance automatic testing and complete a CI/CD pipeline
- Swap out namelists for JSON input
These last two are part of an effort to get SELF-Fluids cloud-ready, so that it can ultimately provide "Fluid Simulations as a Service". In the last couple of weeks I've been diving into JSON-Fortran, an open-source Fortran API for reading and writing data in JSON format. By coupling JSON-Fortran with an equation parser I wrote in Fortran last year, I'm looking forward to improving the user interface quite a bit.
JSON-Fortran in SELF-Fluids
So far, my experience with JSON-Fortran can be described as "skeptical-to-intrigued-to-impressed". The JSON-Fortran code takes advantage of a wide range of modern OO-Fortran features and is well organized. For programmers who haven't worked extensively with Fortran pointers and object-oriented programming, JSON-Fortran can feel like it has a learning curve. However, the developers have done an excellent job keeping the documentation up to date.
I found the derived type documentation to be most useful when developing with this API. While perusing this documentation before developing, it became clear that the public routines follow a simple pattern for getting data from and adding data to JSON structures, independent of the type of data you're working with. In looking through the source code, it was nice to see well-placed usage of type-bound generic procedures and polymorphism to make this possible.
Module testing enabled by JSON-Fortran
I decided to pick up JSON-Fortran in a small project to start developing tests for my own derived types in SELF-Fluids. The Spectral Element Libraries in Fortran (SELF) side of SELF-Fluids defines a number of classes for approximating Calculus operations (e.g. div, grad, curl) on isoparametric spectral element meshes. One of these classes is the Lagrange Class, which defines Lagrange interpolating polynomials that we can use to approximate, interpolate, and differentiate (in the calculus sense) scalar, vector, and tensor functions in one, two, and three dimensions.
When testing the routines associated with the Lagrange Class, I want to be able to specify a function to interpolate and differentiate and calculate the numerical error of approximating the function with a Lagrange interpolating polynomial. Spectral Element theory provides error convergence rates and conditions for exactness (that I won't get into here).
Essentially, the test program for the Lagrange Class will
- Ingest specifications of functions and their exact derivatives
- Approximate functions and derivatives (divergence, gradient, and curl) and estimate and report numerical errors
This will provide an isolated component for testing within the continuous integration (CI) pipeline.
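Roughly, a driver that follows these two steps could be laid out like the sketch below. The program structure here is illustrative, not the actual Lagrange_Test.F90; only the json_file calls are real JSON-Fortran API.

```fortran
PROGRAM Lagrange_Test_Sketch
  USE json_module
  IMPLICIT NONE
  TYPE(json_file) :: json
  INTEGER :: nTests, iTest

  CALL json % Initialize()
  CALL json % Load(filename = './lagrange.test.json')

  ! (1) Ingest the function/derivative specifications
  CALL json % Info('scalar_1d', n_children = nTests)

  DO iTest = 1, nTests
    ! (2) Build the interpolant for each specification, approximate the
    !     derivatives, and record the numerical errors (details below)
  ENDDO

  CALL json % Destroy()
END PROGRAM Lagrange_Test_Sketch
```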
How it works, by Example
As an example, the input for 1-D testing looks like:
{
"n_plot_points": 12,
"polynomial_range":[1, 7],
"scalar_1d":
[
{
"name": "constant",
"function": "f = 1.0",
"derivative": "dfdx = 0.0"
},
{
"name": "linear",
"function": "f = x",
"derivative": "dfdx = 1.0"
},
{
"name": "gaussian",
"function": "f = exp( -(x^2) )",
"derivative": "dfdx = -2*x*exp( -(x^2) )"
}
]
}
The complete file can be found on the refactor branch of the SELF-Fluids Bitbucket repository. The JSON contains an array called scalar_1d. Each item is an object with the attributes name, function, and derivative. The function and derivative attributes provide equation strings that the equation parser digests at run-time to evaluate the exact values of the functions and derivatives. The function evaluation is used to set nodal values for the interpolant and to compare against interpolated function values and derivative values. Finally, once the errors are calculated, they are written to JSON output using JSON-Fortran, with the name identifier alongside the function and derivative errors. Example output is shown below:
{
"scalar_1d":
[
{
"name": "constant",
"polynomial_degrees": [1,2,3,4,5,6,7],
"f_errors":[],
"df_errors": []
}
...
]
}
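Generating that output from Fortran follows the same pattern in reverse. Here's a hedged sketch of how such a report could be assembled with json_core; the error arrays and file name are placeholders, and I'm assuming the default real kind json_RK exported by json_module.

```fortran
USE json_module
TYPE(json_core) :: core
TYPE(json_value), POINTER :: root, tests, test
REAL(json_RK) :: fErrors(7), dfErrors(7)

fErrors  = 0.0_json_RK   ! placeholder values; filled by the error analysis
dfErrors = 0.0_json_RK

CALL core % initialize()
CALL core % create_object(root, '')          ! the outermost { }
CALL core % create_array(tests, 'scalar_1d') ! the "scalar_1d" array
CALL core % add(root, tests)

CALL core % create_object(test, '')          ! one array element
CALL core % add(test, 'name', 'constant')
CALL core % add(test, 'polynomial_degrees', [1,2,3,4,5,6,7])
CALL core % add(test, 'f_errors', fErrors)
CALL core % add(test, 'df_errors', dfErrors)
CALL core % add(tests, test)

CALL core % print(root, 'lagrange.errors.json')
CALL core % destroy(root)
```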
What does the code look like?
You can find the Lagrange_Test.F90 program in the SELF-Fluids repository. Rather than showing the code in its entirety here, I want to highlight key routines from JSON-Fortran that I used to ingest the input JSON. First, loading the JSON file and building the underlying data structure in Fortran is as simple as a declaration and two subroutine calls:
USE json_module
TYPE(json_file) :: json
CALL json % Initialize()
CALL json % Load(filename = './lagrange.test.json')
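At this point it's worth guarding against a missing or malformed file; the json_file type provides failed and print_error_message for this. A small sketch:

```fortran
USE iso_fortran_env, ONLY : error_unit
IF( json % Failed() )THEN
  CALL json % Print_Error_Message(error_unit)
  STOP 1
ENDIF
```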
To iterate through arrays whose size we don't know a priori, we can use the json_file % info routine. In my case, I've used this to determine how many elements are in the scalar_1d array:
INTEGER :: nTests
CALL json % info('scalar_1d', n_children=nTests)
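The top-level scalars and arrays in the input can be retrieved through the same json_file % get interface, with the found flag guarding against missing keys. For example, for the n_plot_points and polynomial_range entries shown earlier:

```fortran
INTEGER :: nPlotPoints
INTEGER, ALLOCATABLE :: polyRange(:)
LOGICAL :: found

CALL json % get('n_plot_points', nPlotPoints, found)
CALL json % get('polynomial_range', polyRange, found)
```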
The json_file object has json_core and json_value pointer attributes. Essentially, the json_value attribute is a linked-list element with parent and child attributes. This structure allows you to dive into nested objects, like the scalar_1d array of objects in this case. JSON-Fortran provides convenient routines to access both parent and child objects through json_value pointers, via type-bound procedures of the json_core.
To obtain the i-th element of the scalar_1d array, I first get the json_core object associated with the currently open json_file and obtain the pointer to the scalar_1d array:
TYPE(json_core) :: jCore
TYPE(json_value), POINTER :: scalar1dPointer
LOGICAL :: found
CALL json % get_core(jCore)
CALL json % get('scalar_1d', scalar1dPointer, found)
Using the length of the array I obtained previously, I opened a loop and used the json_core % get_child routine to obtain a pointer to the i-th element of the scalar_1d array:
TYPE(json_value), POINTER :: testPointer
INTEGER :: iTest
DO iTest = 1, nTests
CALL jCore % get_child(scalar1dPointer, iTest, testPointer, found)
Now that we have a pointer to the array element, we need to dive one level deeper to obtain the object attributes name, function, and derivative. Again, I use the get_child routine to obtain a pointer to the attribute, then json_core % get to get the value of the attribute. For example, to get the value associated with the name key:
TYPE(JSON_VALUE), POINTER :: p
CHARACTER(KIND=JSON_CK, LEN=:), ALLOCATABLE :: var
CALL jCore % get_child(testPointer, 'name', p, found)
IF( found )THEN
CALL jCore % get(p, var)
ENDIF
From here, I pass the function and derivative strings to the SELF equation parser and continue with the remainder of the error analysis:
TYPE(EquationParser) :: f, dfdx
CHARACTER(50) :: fChar, dfdxChar  ! hold the "function" and "derivative" strings retrieved above
f = EquationParser(fChar)
dfdx = EquationParser(dfdxChar)
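The error estimate itself can then be sketched as below. This is illustrative rather than the actual SELF routine: I'm assuming the parser exposes an Evaluate method that takes the values of the independent variables, and fApprox stands in for values produced by the Lagrange interpolant.

```fortran
INTEGER, PARAMETER :: nPlot = 12
REAL(8) :: x(0:nPlot), fExact(0:nPlot), fApprox(0:nPlot), fError
INTEGER :: i

DO i = 0, nPlot
  x(i) = -1.0d0 + 2.0d0*REAL(i,8)/REAL(nPlot,8)  ! uniform plot points on [-1,1]
  fExact(i) = f % Evaluate( (/ x(i) /) )         ! exact function value (assumed API)
ENDDO
! ... fill fApprox by evaluating the Lagrange interpolant at x ...
fError = MAXVAL( ABS( fApprox - fExact ) )       ! max-norm error estimate
```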
What's next?
All of this is a work in progress, and I'll be building out the testing infrastructure using JSON-Fortran and the equation parser in the near future. Further ahead, once SELF is sufficiently cleaned up, I'll be updating the compressible fluids solver to use the new SELF updates. I'm also looking forward to getting away from the namelist parameters I'm currently using for sfluid.
My vision is to be able to drive my simulation workflow through a single JSON input, so that it can ultimately be integrated into more of a "pipelines" framework for PDE solvers. On the input side, I envision specifying (for example):
- A mesh generator and its inputs
- A fluids solver (like SELF-Fluids) and its inputs
- Post-processing, visualization, and data archiving details
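For concreteness, such a driving input might look something like the fragment below. Every key and tool name here is hypothetical; it's only meant to illustrate the shape of the file:

```json
{
  "mesh_generator": {
    "name": "some-mesh-tool",
    "input": "./mesh.params.json"
  },
  "fluid_solver": {
    "name": "self-fluids",
    "input": "./self.config.json"
  },
  "post_processing": {
    "visualization": "contour-plots",
    "archive": "./results/"
  }
}
```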
There's certainly a lot more work to do than just switching to JSON input, though this contributes to building the pipeline. To make such a framework more useful for others, there's certainly a need to start standardizing PDE solver output formats and mesh storage formats (specifically for high order methods).