Oracle support in test

The test version of the Anchor Modeler has been updated with preliminary Oracle support. The Oracle generation was kindly donated to us by the Swedish bank SBAB. Many thanks to Mikael Backlund (SBAB) and Jonas Åhgren (Xanthe IT) for the work they put into the Oracle generation! It is not a full implementation of uni-temporal yet, so if someone wants to step in and continue the work, please let us know. It is also largely untested by us, so report back if you find something that is broken.

We have also refactored the Layout Engine in order to make it self-contained and easier to break out and use in other projects. A new feature called equivalence is also being added to uni-temporal (with concurrent-reliance-temporal to follow). Please note that this is still experimental and not yet complete, and the exact implementation may change. Equivalence can be used to create, for example, multilingual or multi-tenant databases. Simply put, it covers use cases where you may have simultaneous and equivalent values in attributes and knots. We introduced the concept in the presentation Time back in 2011, but have not had an implementation of it until now.

0.98 updated in test

The test version has been updated to 0.98. In this version CRT modeling has had a number of bug fixes applied, which also made it into stable today as version 0.97.4. One of the major features implemented so far in 0.98 is a switch from canvas drawing to native SVG manipulation. The canvas drawing, while very performant, proved too difficult to extend. As browsers have matured, SVG has almost caught up in performance, and with SVG, adding new graphical primitives is much easier.

As a result of this, event handling had to be rewritten, so quite a lot has changed in the interface even if it looks deceptively similar to the canvas version. We would therefore appreciate it if you report any anomalies you find in 0.98. Some rarely used features had to go, like the drop shadow effect. Others have been refined, like the graphical debugging of the force-directed graph layout algorithm.

This is just the first of many new features in 0.98. Stay tuned!

Version 0.97 RELEASED

Today we released version 0.97 as stable. This brings along a lot of changes and new features. Foremost among these is the switch to a new SQL code generation engine, based on our sisula template language, and of course, built with it, the new concurrent-reliance-temporal modeling. This technique makes it possible to build a ‘perfect memory’, equivalent to having a time machine: you can travel to any moment in time (both forwards and backwards) and see what the information looked like at that moment. Furthermore, you can see the information through the eyes of anyone, along with the reliability with which they stated each fact.

Since the whole SQL generation is new we would not recommend moving databases running 0.96 in production to 0.97 before having done thorough testing. The old version, 0.96, can still be reached here. With this we are also moving on to 0.98 in test. We have a number of exciting things planned for 0.98. Stay tuned!

The 500th revision

We are celebrating the 500th code revision (the first was in 2010), which is now available in the test version of the modeler. With this we are also doing a feature freeze for version 0.97, only fixing potential bugs, with the intent to move it to stable soon. After all, there are already a number of databases and data warehouses using that codebase in production.

Noteworthy features that have been added to 0.97 since our last post (rev 480), along with numerous small fixes and improvements:

Concurrent-reliance-temporal modeling
The CRT generation is feature complete and has been tested, but could use more testing before we can say that it is “bug free”. If you have the possibility, please help us out with this. CRT will make it possible to time travel over your information in three dimensions: over changing time (versions), over positing time (corrections), and over positors (origins). This will definitely be the next big thing in modeling, as everywhere we have shown it, people have realised how it solves problems they either had discarded as too difficult to solve or had poor workarounds for.

Business views
The views provide simplified naming and show only values and keys for the different perspectives: latest, current, point, and difference. They should make it easier for those who want to expose the Anchor model directly to end users, such as analysts or report builders. With business views the need for marts is even smaller than before, and we expect that many will be able to get rid of the additional maintenance layer that marts constitute.

Checksums
Some of the larger data types in SQL Server come with limitations, such as not being able to take part in an index or to be compared to each other using the equals operator. In order to use such types, for example varchar(max), text, or geography, you can now tick a checkbox and an MD5 checksum will be generated and used instead in indexes, triggers, and comparisons.
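
The idea behind the generated checksum can be sketched as follows. This is only an illustration with hypothetical table and column names, not the code the tool generates, and note that HASHBYTES is limited to 8000 bytes of input on older versions of SQL Server.

    -- Minimal sketch of the checksum idea (hypothetical names, not generated code).
    -- HASHBYTES('MD5', ...) yields a 16-byte value that can take part in an index
    -- and be compared with the equals operator, unlike varchar(max) itself.
    CREATE TABLE Stage_Document (
        Document_ID int not null primary key,
        Document_Text varchar(max) not null,
        Document_Checksum as cast(HASHBYTES('MD5', Document_Text) as varbinary(16)) persisted
    );
    CREATE INDEX IX_Stage_Document_Checksum ON Stage_Document (Document_Checksum);

    -- Loading logic can then detect genuine changes by comparing checksums
    -- instead of the full values:
    -- WHERE HASHBYTES('MD5', @newText) <> Document_Checksum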

First Ever Certified Anchor Modelers

The first ever class of Certified Anchor Modelers graduated last week! It was a great three days of classes with engaged and skillful students, whose intellects were put to the test when it came to managing the three dimensions of time travel possible with Anchor. Traveling over changing time and positing time for different positors using concurrent-reliance-temporal modeling proved to be no problem for the students though, and everyone passed the certification with flying colors. Congratulations to you!

We also added a Directory of Certified Anchor Modelers which can be found by clicking here. Let us know if you want/need any of the information in the directory updated.

Here are a couple of photos from the class and the surprise dinner cruise we went on:

Anchor Modeling Certification Course

We are proud to announce the first public Anchor Modeling Certification Course, to be held in Stockholm on February 5th to 7th. Please see the details below for how to sign up for the course and following certification. The course is held in central Stockholm, and we can help recommend accommodation for foreign participants.

Instructor: Lars Rönnbäck

Duration: 2 days of lectures and workshops followed by ½ day of assisted examination.

Cost: €2500 + VAT

Location: Stockholm, Sweden, hosted by UpToChange (www.uptochange.com).

Date: February 5th to 7th, 2014. There are still a few seats available, so we are extending registration until February 3rd.

Contact: sales@uptochange.com

Edit: Some have reported trouble reaching this mail address. If so, you can use lars.ronnback@anchormodeling.com or lars.ronnback@gmail.com instead.

Target Audience: Data Modelers, Enterprise Architects, Data Warehouse Architects, Business Intelligence Architects. A prerequisite for the class is to have a general familiarity with and understanding of database modeling concepts.

Certification Validity: Your certification will get a version number, specifying which version of the certification you took. A certification of a certain version is valid for life. If and when enough things have changed in Anchor Modeling to warrant a new version of the certification, the version number will be increased and people can decide if they want to recertify for the new version or not.

Course Description: The course will cover both uni-temporal (the original) and concurrent-reliance-temporal Anchor Modeling from theoretical and practical points of view, with students building their own models using our modeling tool as exercises. After the course and certification you will be equipped with the knowledge necessary to architect your own solutions using Anchor Modeling.

The course will introduce bitemporal concepts used to handle versioning and corrections of information, while retaining a complete history of such changes. The reliability of information, and how it can be used to solve otherwise difficult modeling challenges, is discussed and introduced in one of the exercises. Storing concurrent and opposing views of the same information, but with different origins, is also handled. The course will also demonstrate how “time traveling” is possible through the use of parametrized views, making it possible to ask complex temporal questions through simple SQL, and how these queries gain excellent performance thanks to modern query optimization techniques. Other advanced topics in Anchor Modeling, such as evolving models over time, design patterns, refactoring, and management of unknown values, are also covered in the course.
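
As a hedged illustration of what such time travel looks like in practice, assuming an anchor for actors similar to the one in our example model (the actual view and function names in your database are derived from your own mnemonics):

    -- Illustration only; the names below are assumptions based on the example model.

    -- The latest perspective: the information as it looks right now.
    SELECT * FROM lAC_Actor;

    -- The point-in-time perspective: the information as it looked on a given date,
    -- retrieved through a parametrized (table-valued) view.
    SELECT * FROM pAC_Actor('2013-06-01');

    -- In a concurrent-reliance-temporal model the parametrized perspectives also
    -- take a positor and a positing time, so the information can be seen through
    -- someone else's eyes, as it was asserted at that point in time.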

Background: Anchor Modeling is an agile modeling technique particularly suited for data environments that change over time. It has been developed in a collaboration between Swedish industry and the Dept. of Computer Science at Stockholm University. With close to ten years since the first implementations, along with award-winning Open Source tools and scientific papers, the technique is now gaining worldwide momentum. Anchor Modeling can be implemented in traditional relational databases and is based on current research in entity relationship modeling, database normalization, and temporal databases.

About the instructor: Lars Rönnbäck is one of the authors of the Anchor Modeling technique. He works as a senior consultant at UpToChange and is a research affiliate with Stockholm University. He has worked with some of the largest companies in the Swedish insurance and retail businesses, is an avid presenter at conferences, and has more than ten years of experience in the field of Business Intelligence.

The Iceberg Exercise

We have developed a new exercise that will be used in our Anchor Modeling classes. It has been designed to show many aspects of concurrent-reliance-temporal modeling. The idea is to model a database for “An Iceberg Tracking and Drift Prediction System”. The document can also be found under the Publications menu. The entirely fictional system has the following objective and goals.

Objective
To track and predict the movements of icebergs in order to avoid collisions with vessels traveling on or below the oceanic surface, and allow for better route planning in the presence of icebergs.

Goals
Hourly track the position of every iceberg larger than 1 square mile and accurately predict their positions in 1, 3, 5, 10, 15, and 30 days. Areas free from icebergs are also reasonably well predicted up to half a year in advance. This data should be made freely available on the Internet.

v0.97 updated in test

We have updated the test version to revision 480, in which you can begin to test the concurrent-reliance-temporal implementation for attributes. Ties will follow in a later test version. (Update October 9th: we are almost there now!) The following fixes and features have been added since the last test version (r453):

  • r480
    Corrected information in about.html. Now pushing to test.
  • r479
    Adding tie insert and update triggers. Fixing some issues with the generated CRT SQL. Really fixed the zeroing velocity bug this time. Pushing this to test.
  • r478
    Zeroing velocity when stopping nodes, yielding a slight performance tweak and better layout.
  • r477
    Stabilizing generated CRT code. Now executes without error. Triggers on ties are still not done.
  • r476
    Continuing work on CRT sisula templates. Tie rewinders done. Perspectives started.
  • r475
    Continuing work on CRT.
  • r474
    Added tie restatement constraints.
  • r473
    Fixed SQL generation bugs. CRT code now executed without errors.
  • r472
    Continuing work on CRT.
  • r471
    Starting work on triggers in CRT. Broken generation now.
  • r470
    Better index alignment.
  • r469
    Updated about.html. Pushing to test (again).
  • r468
    Added default positor (=0) and _Positor table for holding all positors.
    Rewrote latest perspectives to become views again, using the positors table to
    determine which positors are available.
    Changed partitioning to be based on positors, and made it work.
    Cleaned up sisula files and moved defaultCapsule to NamingConvention.js.
    Fixed bug that only made unix files work in sisula, due to CRLF handling.
    Fixed bug in partitioning.
  • r467
    Updated about.html with revision number. Pushing to test.
  • r466
    Table elimination is now in effect for CRT, but still untested whether results are the same as with the previous logic. If no temporalization is present in the XML, default will now be whatever is specified in the Defaults.
  • r465
    Added (generated) date and time metadata attributes to the XML.
  • r464
    Fixed the bug with Defaults overwriting settings for ties and anchors as well.
  • r463
    Fixed bug where Default setting of restatements and idempotency would overwrite anything explicitly specified in a loaded XML file.
    Pushing this release to test, since this bug was affecting some users.
  • r462
    Trying various approaches for table elimination.
  • r461
    Added a GenerateDropScript procedure in the Schema Tracking sisula. Trying to get table elimination for CRT to work. Changed granularity of the changing time in the example model.
  • r460
    Fixed bug in key generation SP (only one key could be generated). Added extra null checking and defaults in the insert trigger. Continuing work on CRT.
  • r459
    Continuing work on Concurrent-Reliance-Temporal (CRT) Modeling. Reverted the temporal PK change. Added attribute rewinders. Changed the restatement constraint to make use of rewinders.
  • r458
    Added restatement constraint for attributes. Fixed temporal primary key (entity integrity).
  • r457
    Added assembled views for attributes in the CRT generation.
  • r456
    Starting work on concurrent-reliance-temporal modeling. Knot, anchor, and attribute tables are created correctly.
  • r455
    Forgot to change VERSION to 0.97.
  • r454
    Fixed typo in transformToFragment.

Help us improve SQL Server

While working with SQL Server, we have identified a few shortcomings that we have reported to Microsoft. To raise awareness of these feature requests, we would appreciate your help in voting for their importance. To do so, follow the links below, log in with your Microsoft ID, and click on the green box with an up arrow.

Introduce Substitute Primary Keys in the Optimization Phase

When part of a primary key is used in a calculation, the result of that calculation may be used instead of a column that is part of the primary key and still fulfill all the requirements of being a primary key. For example, if the primary key in a table is (id, stamp) and a view is created as select id, stamp, row_number() over (partition by id order by stamp desc) as ver… then (id, ver) can be used as a substitute primary key for (id, stamp). The query optimizer could use such information to do table (join) elimination when the view is joined on (id, ver) instead of (id, stamp).
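
A sketch of the scenario, with invented table names, looks like this; the left join at the end could be eliminated if the optimizer recognised (id, ver) as a substitute primary key:

    -- Illustrative only; all names are invented for this sketch.
    CREATE TABLE History (
        id int not null,
        stamp datetime not null,
        value varchar(42) not null,
        primary key (id, stamp)
    );
    CREATE TABLE Fact (
        id int not null,
        measure int not null
    );
    GO
    CREATE VIEW VersionedHistory AS
    SELECT
        id,
        stamp,
        value,
        row_number() over (partition by id order by stamp desc) as ver
    FROM History;
    GO
    -- (id, ver) identifies a row in the view just as uniquely as (id, stamp) does.
    -- No columns from the view are selected and the join matches at most one row
    -- per id, so the join could in principle be eliminated, but currently is not.
    SELECT f.id, f.measure
    FROM Fact f
    LEFT JOIN VersionedHistory v
        ON v.id = f.id
       AND v.ver = 1;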

 

Consolidate Execution Plan Paths

When the execution plan contains multiple paths with identical sections, these could be consolidated and executed only once, instead of once per path. See the example below, where three paths could be consolidated, with a substantial gain in performance.
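
The execution plan from the feedback item is not reproduced here, but the kind of query we mean can be sketched as follows (with an invented table). The common table expression is referenced in three paths, and the identical aggregation is currently executed once per path:

    -- Illustrative only; the table is invented for this sketch.
    CREATE TABLE Transactions (
        id int not null,
        amount money not null
    );
    GO
    WITH Totals AS (
        SELECT id, SUM(amount) AS total
        FROM Transactions
        GROUP BY id
    )
    SELECT 'small' AS bucket, COUNT(*) AS cnt FROM Totals WHERE total < 100
    UNION ALL
    SELECT 'medium', COUNT(*) FROM Totals WHERE total BETWEEN 100 AND 1000
    UNION ALL
    SELECT 'large', COUNT(*) FROM Totals WHERE total > 1000;
    -- The aggregation over Transactions appears three times in the plan, even
    -- though its result is identical in every path.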

 

Elimination of Multistatement Table-valued Functions from Execution Plan

In certain circumstances, when a primary key is defined in the table returned from a Multistatement Table-valued Function, the table does not have to be materialized at all, since it does not take part in the execution. This is similar to the already existing table elimination done between tables that have primary keys (and foreign keys) defined. Please see the attached script for a scenario in which the table is “unnecessarily” materialized.
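
The attached script is not reproduced here, but a sketch of the scenario, with invented names, looks like this:

    -- Illustrative only; the function and tables are invented for this sketch.
    CREATE TABLE Source (id int not null primary key, label varchar(100) not null);
    CREATE TABLE Facts (id int not null, measure int not null);
    GO
    CREATE FUNCTION dbo.ExpensiveLookup()
    RETURNS @result TABLE (
        id int not null primary key,
        label varchar(100) not null
    )
    AS
    BEGIN
        INSERT INTO @result (id, label)
        SELECT id, label FROM Source;
        RETURN;
    END;
    GO
    -- No columns from the function are selected, and the left join on its primary
    -- key cannot change the number of rows, yet the returned table is still
    -- materialized instead of being eliminated from the plan.
    SELECT f.id, f.measure
    FROM Facts f
    LEFT JOIN dbo.ExpensiveLookup() x
        ON x.id = f.id;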

 

Also, if you can figure out workarounds, let us know. Thanks!

Examples have been added

We have added a new section to the site called ‘Examples’, in which you can find an example database and example loading scripts. With these we want to show different ways to load data into an anchor model. The scripts can be used as templates for your own loading logic, and as the example model covers most of the different types of modeling patterns, the collection should be rather complete. Find the pattern in the example that matches your situation, then copy and modify those scripts.

We will add more examples in the near future to make this collection as complete as possible.

Click here to go to the examples.