Home
See http://www.inf.ed.ac.uk/teaching/courses/cs2/LectureNotes/CS2Ah/SoftEng/se02.pdf
- The System shall have the ability to pull only partial information from the source, e.g. quantities only rather than the whole geometric BIM model (a sketch of such a partial query follows this list).
- Structure the model and subdivide it according to contractual obligations.
- Provide the ability to split the model objects in new ways (e.g. a single building slab into individual floors).
- Ability to share updates to the server without sharing with other actors until permission to share project-wide is given.
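As a rough illustration of the partial-pull requirement, the sketch below queries only quantity data from a model endpoint. The endpoint path, the `fields=quantities` parameter, and the record shape are all assumptions made for this sketch, not an existing API.

```typescript
// Hypothetical shape of a quantity-only record returned by the server.
interface QuantityRecord {
  objectId: string; // stable id of the source BIM object
  property: string; // e.g. "NetVolume"
  value: number;
  unit: string;     // e.g. "m3"
}

// Pull only quantities, not the whole geometric BIM model.
// URL layout and query parameter are illustrative assumptions.
async function fetchQuantities(baseUrl: string, modelId: string): Promise<QuantityRecord[]> {
  const res = await fetch(`${baseUrl}/models/${modelId}/objects?fields=quantities`);
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return (await res.json()) as QuantityRecord[];
}
```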
- Admin
- User (any user): Planner, Design Coordinator/Project Manager, Consultant/Engineer, Contractor, Architect, Field Operator
- Software Developer
- Observer (can see data but cannot commit)
As a `<role>`, I want `<feature>` so that `<reason>`.
As a Planner, I want to associate planning activities to design objects so that I can plan activities in planning software (e.g. Tilos, Primavera P6, Bentley Synchro, Microsoft Project, etc.).
This includes quantity take-offs, object definitions, schedules, and plans in software such as Tilos, Primavera P6, Synchro, etc. Ultimately, the user wants to see how model changes affect the activities.
- Planner opens all the BIM models and federates them (most likely in Solibri or Navisworks or 3D Repo);
- Create a planning view from the given federated models;
- Create relevant planning activities;
- Export into a sub-plan and save the configuration in order to be able to re-apply it (see the sketch below).
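One minimal way to make the saved configuration re-applicable is to store, per planning activity, both the matched object ids and the selection rule that produced them; when the models are updated, the rules are re-run and the ids refreshed. All names below are assumptions for the sake of the sketch, not an existing format.

```typescript
// Hypothetical saved planning-view configuration linking planning
// activities (e.g. from Primavera P6 or Tilos) to federated model objects.
interface ActivityLink {
  activityId: string;     // id of the activity in the planning software
  objectIds: string[];    // design objects the activity covers
  selectionRule?: string; // rule used to re-select objects after a model
                          // update, e.g. "type=Slab AND level=03"
}

interface PlanningViewConfig {
  federationId: string;   // the federated model the view was built from
  links: ActivityLink[];
}
```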
As an Estimator, I want to create a model view on the design scope so that I can bring it into an estimation software and create a cost estimate.
Software such as CostX and RIB iTWO.
As a Design Coordinator/Project Manager, I want to collect and federate the latest BIM models so that I can assign design tasks/issues back to the individual Consultants/Contractors to resolve.
As a Consultant/Contractor, I want to be told what issues to resolve so that I can improve my design and deliver a new iteration.
This user requirement effectively defines a design feedback loop whereby the models are regularly updated and federated in systems like 3D Repo. There, the issues are collected and passed back to the Consultants/Contractors to resolve, and then the whole process is reiterated.
As a Design Coordinator/Project Manager, I want to define data compliance checks so that individual Consultants/Contractors can self-validate Deltas before sharing.
The standard approach, especially in the contractor world, is to define Employer's Information Requirements (EIRs) and a BIM Execution Plan, which specify the BIM data requirements on a project. Currently, there is no automated way to validate that these have been met when exchanging data.
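As a sketch of what such self-validation might look like, the snippet below runs a set of EIR-derived rules over the objects in a delta and collects any violations before the delta is shared project-wide. The rule set, property names, and delta shape are assumptions, not part of any existing specification.

```typescript
// An object carried by a delta, reduced to what the checks need.
interface DeltaObject {
  id: string;
  properties: Record<string, unknown>;
}

// A rule returns null on pass, or an error message on failure.
type Rule = (obj: DeltaObject) => string | null;

// Example EIR-style rules: every object must carry a classification
// code and a declared LOD. Both property names are illustrative.
const rules: Rule[] = [
  obj => (obj.properties["Classification"] ? null : `${obj.id}: missing Classification`),
  obj => (obj.properties["LOD"] ? null : `${obj.id}: missing LOD`),
];

// Validate a whole delta; sharing proceeds only if this returns [].
function validateDelta(objects: DeltaObject[]): string[] {
  return objects.flatMap(obj =>
    rules.map(r => r(obj)).filter((e): e is string => e !== null)
  );
}
```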
As a Consultant, I want to define the accuracy/certainty/LOD of data/options so that other actors know to what degree to rely on the information.
This reflects the Levels of Detail and Levels of Information in BIM projects. For instance, many objects could be labelled as placeholders at the early design stages. However, if this gets implemented, then we also need the ability to automatically validate that all placeholders have been replaced by the end of the project.
As a Consultant, I want to have multiple representations of the same object so that I can run different analyses for different purposes.
It is important that the system records the relationships between the objects, be they sibling or parent-child relationships. Often there will be different representations of the same object for different purposes: a steel beam in an architectural model is represented differently from the same beam in a structural analysis calculation for lateral stress versus top load, etc.
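A minimal sketch of how such relationships might be recorded is below; the field names and the flat relation list are assumptions, not a proposed schema.

```typescript
// One representation of an object, with explicit links to its other
// representations (sibling) or containing/contained objects (parent/child).
type RelationKind = "sibling" | "parent" | "child";

interface Representation {
  id: string;
  discipline: string; // e.g. "architecture", "structural"
  purpose: string;    // e.g. "visualisation", "lateral stress analysis"
  relations: { kind: RelationKind; targetId: string }[];
}

// The architectural beam points at its structural-analysis counterpart,
// so the system can track both as views of the same physical object.
const archBeam: Representation = {
  id: "beam-arch-001",
  discipline: "architecture",
  purpose: "visualisation",
  relations: [{ kind: "sibling", targetId: "beam-struct-001" }],
};
```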
As a User, I want to have different views of the same data so that I can concentrate only on what is important to me at any given time.
As a User, I want to set notification triggers that are specific to me so that I am only notified of the changes that I care about.
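A user-specific notification trigger could be expressed as a simple predicate over incoming delta events, as in the sketch below; the event shape and field names are assumptions for illustration.

```typescript
// Minimal shape of a delta event, reduced to what a trigger inspects.
interface DeltaEvent {
  author: string;
  discipline: string;
  objectType: string;
}

type Trigger = (e: DeltaEvent) => boolean;

// e.g. a structural engineer who only cares about changed slabs and beams:
const myTrigger: Trigger = e =>
  e.discipline === "structural" && ["Slab", "Beam"].includes(e.objectType);

// Only events matching the user's trigger generate a notification.
function notifyIfRelevant(e: DeltaEvent, trigger: Trigger, notify: (e: DeltaEvent) => void): void {
  if (trigger(e)) notify(e);
}
```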
- US7: As a Field Operative, I want to update progress on site within the model so that the project can be controlled and monitored transparently by all other stakeholders/Users.
This should help with invoicing (cash turns) and also with having the As-Installed Model ready-made.
- US8.1: As a User, I want to decide when I receive updates and when I synchronise with the master data so that I am in control of my workflow.
- US8.2: As a User, I want to have the capacity to get a preview of the changes before synchronising so that I do not damage my working copy, should the proposed update not be applicable.
- US9.1: As an Admin, I want to ensure Developers adhere to the standard so that all new software is interoperable.
- US9.2: As a Developer, I want to have good documentation so that it is easy to adhere to the standard and develop.
- US10: As a User, I want to have different branches/streams/options so that I can be in control of the architecture of data sharing (e.g. in order to mirror the scope of works on a project).
Results that are not necessarily functional requirements but would be the expected side effects of the project implementation.
- The software connected to AEC Deltas should eventually have the ability to recalculate whatever analysis, design, planning, etc. it is doing (automatically or under the control of users) whenever a single change occurs and a delta event is triggered, so as to keep up to date as appropriate.
Technical requirements:
- Files to be decomposed into object-level components that define individual deltas across applications. Each change should be recorded and streamed to the server. For a peer-to-peer implementation, the local application could store all deltas in sequence locally as a shadow copy (to be discussed) that can be synchronised against;
- Individual changes (deltas) to be streamed one at a time, in sequence. This ensures synchronised delivery, since the order of changes matters, unlike the time at which each change occurred (a sketch of a possible delta envelope follows this list);
- File-format-independent delta definition that can support IFC but also any other data representation such as BHoM;
- Two end-points to be able to synchronise over time to the latest state. The most difficult part will be the order of actions and the need to implement either locking or conflict resolution, see here: 3D Diff
- End-to-end data encryption, including at rest. This is to ensure that data is available only to those users that have the appropriate decryption key;
- Scalability to large infrastructure projects. The proposed solution needs to be applicable equally to buildings as well as linear assets. After all, on the data model level, a highway is like a skyscraper, just sideways;
- Ability to exchange either object definitions or triangulated meshes where required. 3D Repo provides some functionality to generate geometry on the fly from object definitions thanks to the IFCOpenShell library, but assuming authoring tools provide access to generated meshes, those should be the preferred way of retrieving data (due to computational overhead and time constraints);
- Ability to identify the author of each change via a cryptographic signature. Every delta should be attributable to an individual working on the project as undeniable proof;
- Permissions-based access control. Only users with the right access privileges are allowed to access data. The question is whether we should account for access to some but not all data, such as federations of multiple disciplines, exclusion of sensitive information (e.g. CCTV models), etc.;
- Centralised mapping of fields/elements across different applications with in-built versioning. Each mapping API should be properly versioned and distributed to all parties from a centralised location so that new inputs can be tracked and audited.
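Pulling several of the requirements above together, one possible envelope for a streamed delta is sketched below: a server-assigned sequence number fixes the canonical order, the payload stays schema-agnostic, and an author signature makes each change attributable. This is an illustration of the requirements, not a defined wire format.

```typescript
// Hypothetical envelope for one streamed delta.
interface DeltaEnvelope {
  sequence: number;  // server-assigned; defines the canonical order of changes
  objectId: string;  // the object-level component being changed
  payload: unknown;  // schema-agnostic body (IFC, BHoM, or any other representation)
  author: string;    // user id of the change author
  signature: string; // cryptographic signature over payload + sequence, for attribution
  timestamp: string; // informational only; ordering always uses `sequence`
}
```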
Defining the scope of the project:
- Horizontal asset delivery (design & construction phase) as per RSRG work in rail: traditional or slab track - design information
- Focus on new build, with lesser priority on the "Operate" phase; de-prioritise "Refurbish"; these will be dealt with in the project evaluation
- Must include pre-construction and construction workflows, including the simulation of construction processes:
Process and data needs will be detailed as part of WP2 in M2 and M3:
- Designing to tender level & planning approval (GRIP Stage 3 to 4)
- Designing to construction level & construction approval (GRIP Stage 4 to 5)
- Enabling GRIP Stage 5 information to flow into GRIP Stage 3 - design for fabrication / early contractor involvement
- Quantifying material quantities, temporary works, and labour content during GRIP Stages 2 to 6 by enabling data exchange between design tools and estimation tools (Bentley and Autodesk to CostX, Tilos, and RIB)
- Planning of timescales, site access, and material flows during GRIP Stages 4, 5, and 6 by enabling information exchange between design tools and planning tools (Bentley and Autodesk to Synchro and Tilos)
- Linking of tool outputs to dashboarding and visualisation tools to present/optioneer outcomes at global KPI level
See https://www.transwilts.org/images/pdf/Guide_to_Rail_Investment_Process-1.pdf for the GRIP stage definitions
- Should the format support versioning or not?
- Random access: reading parts of the dataset (jumping to a specific location, or just the latest updates)
- A file (local) representation and a server-side representation
- A USD file with random access: is it useful? It works well for huge datasets. See https://graphics.pixar.com/usd/docs/Usdz-File-Format-Specification.html
- Revit, Archicad, SketchUp, Rhino, CityEngine. Low priority: Tekla, Catia, 3ds Max, Blender
See also: https://www.nic.org.uk/wp-content/uploads/Data-for-the-Public-Good-NIC-Report.pdf
Technical Requirements:
- Any new adapter interface/protocol will need to provide both server/client and server-less behaviour, as required by the end user. This requires a replicated architecture of the server for when offline (queuing of CRUD/IO operations to then be pushed when back online);
- The user is in control of whether to work locally or push to the server for remote access;
- Object history is recorded;
- Infrastructure to enable (and not preclude) full version control must also be possible;
- Object-level diffing (reduction of payload);
  - combined with object-property-level comparisons;
  - the combination of which allows the potential for efficient comparisons/diffing, with the inclusion of tolerances (a sketch follows this list);
- The data format for the protocol should be flexible and allow completely schema-less data - minimal requirements on the header/diff of each data packet are a technical implementation requirement, and thus no such formatting requirements should be imposed on the end user;
  - However, in addition, there is the possibility that conforming to a defined schema will enable further capability of the system (e.g. extraction of geometry properties or a predefined metadata schema);
- The format for diffing/comparison may depend on the data type (see above);
- Objective: to minimise the payload on each push;
- Store snapshots vs diffs vs actions of the user;
  - following the latter of the above - storing actions of the user, combined with data source etc. - enables "data flow tracking/query/viz";
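As a sketch of object-property-level comparison with tolerances, the function below diffs two flat property records and reports only the properties whose values genuinely differ; the flat-record shape is an assumption made for illustration.

```typescript
type Props = Record<string, number | string>;

// Return the names of properties that differ between two records,
// comparing numbers with a tolerance and everything else exactly.
function diffProps(a: Props, b: Props, tol = 1e-6): string[] {
  const keys = new Set([...Object.keys(a), ...Object.keys(b)]);
  const changed: string[] = [];
  for (const key of keys) {
    const va = a[key];
    const vb = b[key];
    if (typeof va === "number" && typeof vb === "number") {
      if (Math.abs(va - vb) > tol) changed.push(key); // numeric: compare within tolerance
    } else if (va !== vb) {
      changed.push(key); // everything else: exact comparison
    }
  }
  return changed;
}

// diffProps({ length: 2.0000001, grade: "S355" }, { length: 2.0, grade: "S355" })
// returns [] with the default tolerance, so no delta needs to be pushed.
```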
As a knowledge-provider partner, UCL puts forward the following, which can serve as a base/prior work for the future development of the user requirements above and those mentioned in the grant work packages. The points below are not "user requirements" per se, but rather "technical requirements/background" based on the Speckle platform, which was previously developed at UCL within an H2020 grant.
- Authorship: the Speckle server implements a discretionary access control layer that assigns an owner to each created resource (object, object collection, etc.). This is implemented via Speckle's API authentication mechanism.
- Encryption (SSL) is usually handled by the proxy server using standard letsencrypt certificates (and is not part of the implementation).
- Encryption at rest is usually handled by the database (MongoDB) deployment (and is not part of the implementation). See the following documentation.
- Regarding container specification: see below.
Speckle already has an API contract written in OpenAPI v3, coupled with generated documentation of the endpoints.
Delta updates and diffing are currently managed at the client level and operate on object collections rather than individual objects. These could be enhanced to contain more granular update notifications about single/multiple delta updates.
Nevertheless, it is important to note that various client (CAD software) implementations will require different types of diffing based on their needs; hence we recommend not implementing these behaviours server-side beyond a common-sense approach.
Publisher-subscriber design:
- The Speckle server exposes both the REST API mentioned above and a WS server
- All clients connect as either publishers or subscribers to the WS server on a resource-based classification, subject to permission access levels.
- The server propagates messages to all subscribers, but is also able to handle individual client-to-client messages.
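A minimal client-side sketch of this flow, using the Node `ws` package, is below. The connection URL, message shapes, and resource-based routing are assumptions about the design described above, not the actual Speckle protocol.

```typescript
import WebSocket from "ws";

// Hypothetical WS endpoint; authentication detail omitted.
const ws = new WebSocket("wss://speckle.example.com/ws");

ws.on("open", () => {
  // Subscribe to delta events on one resource (e.g. a stream), subject to
  // the caller's permission level on that resource.
  ws.send(JSON.stringify({ type: "subscribe", resource: "streams/abc123" }));
});

ws.on("message", raw => {
  const msg = JSON.parse(raw.toString());
  if (msg.type === "delta") {
    console.log(`delta on ${msg.resource} from ${msg.senderId}`);
  }
});
```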
Distributed architecture: The speckle server is stateless - all api calls are authenticated and WS messages are coordinated across all server instances.
- Speckle is schema-agnostic, supporting out of the box several object models developed internally by various companies.
- Furthermore, there is a planned integration with the BHoM object model.
- Speckle Server
- Speckle API Specifications
- Speckle Client Integrations:
- Rhino & Grasshopper
- Dynamo
- Revit, Unity, GSA, and others are WIP