Data Fabric IG: Broker Driven Core Component Workflows
The Data Fabric IG (DFIG) identified that working with data in many scientific labs, and most probably also in other areas such as industry and government, is highly inefficient and too costly. Excellent scientists working on data-intensive science tasks are forced to spend about 75% of their time managing, finding, combining, and curating data, which is a waste of time and capacity. The DFIG is therefore examining the data creation and consumption cycle to identify opportunities to optimize work with data, to place current RDA activities in the overall landscape, to look at what other communities are doing in this area, and to foster testing and adoption of RDA outputs.
The Brokering IG is examining the overarching concepts and frameworks for achieving interoperability across different disciplinary data systems, leaving those systems autonomous and avoiding pushing them towards any specific implementation model. Such technological solutions are commonly called brokering frameworks. They generally provide mediation and adaptation services in a way that is transparent to users, whether intermediate or final. Brokers may play a facilitating role in the landscape depicted by the Data Fabric IG and act as one of the main "fabric components" described there, especially with regard to data finding, access, evaluation, and use.
This page pulls together activities across these two IGs to determine whether we can develop a set of workflow patterns in which brokers mediate between the core components of the Data Fabric to implement these workflows.
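The brokering pattern described above can be sketched in code. This is a minimal illustration, not any actual DFIG or Brokering IG implementation: all class and method names here (`Broker`, `CsvCatalogue`, `XmlRepository`, etc.) are hypothetical. The point it shows is that a broker exposes one uniform interface to users, while per-system adapters translate requests into each data system's native API, so the systems themselves remain autonomous and unchanged.

```python
# Hypothetical sketch of a brokering framework: a broker mediates between
# a client and heterogeneous, autonomous data systems via adapters.

from dataclasses import dataclass


@dataclass
class Record:
    identifier: str
    payload: str


class CsvCatalogue:
    """A hypothetical data system with its own native lookup interface."""
    def lookup(self, key: str) -> str:
        return f"csv-row-for-{key}"


class XmlRepository:
    """Another hypothetical system with a different native interface."""
    def fetch_document(self, doc_id: str) -> str:
        return f"<doc id='{doc_id}'/>"


class Broker:
    """Mediation layer: clients call one method; adapters hide each
    underlying system's API, keeping those systems unmodified."""
    def __init__(self) -> None:
        self._adapters = {
            "catalogue": lambda key: CsvCatalogue().lookup(key),
            "repository": lambda key: XmlRepository().fetch_document(key),
        }

    def find(self, system: str, key: str) -> Record:
        # Adaptation happens here, transparently to the caller.
        return Record(identifier=key, payload=self._adapters[system](key))


if __name__ == "__main__":
    broker = Broker()
    print(broker.find("catalogue", "A1").payload)   # csv-row-for-A1
    print(broker.find("repository", "A1").payload)  # <doc id='A1'/>
```

In a real brokering framework the adapters would wrap network protocols and metadata mappings rather than in-process classes, but the mediation role of the broker is the same.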
Working Meeting Scheduled for P9 in Barcelona (Wednesday 5 April 2017, Breakout 3, 16:00-17:30)

Agenda
16:00-16:10 Short introduction (Stefano/Jay)
16:10-16:20 Biodiversity use case (Dimitris/Wouter/Donald)
16:20-16:30 DKRZ Climate Science use case (Tobias)
16:30-16:40 Workflow Diagrams (Peter/Bridget)
17:15-17:30 Wrap up/next steps