(****
 *
 * See accompanying text in semi-formal-spec.me, in the Section titled "Two
 * Ways to Model ScheduleEvent".  Language-wise, what's going on here is the
 * long-sought-after parallel decomposition/composition of inputs and outputs
 * of parent ops into and out from child ops.  I.e., a composite obj coming
 * into a parent op can be decomposed into its pieces, to be supplied to child
 * op inputs (I don't actually remember the graphical notation for this, but I
 * do recall that it exists, at least informally).  Coming out of the child
 * ops, their individual outputs can be composed into the larger parent op
 * output.
 *
 * E.g., in the example in this file, we have CalendarDB coming into the
 * parent op DoScheduleEvent.  The UserCalendar component of CalendarDB is
 * decomposed out to be fed into the input to sub-operation ScheduleEvent.
 * Going out, the original OtherStuff component of the input CalendarDB is
 * composed with the UserCalendar output of ScheduleEvent to produce the
 * CalendarDB output of the parent op.  Get it??
 *
 * An important point to make in this context is that we assume certain
 * top-level objects to be persistent, in that they are available to be used
 * as direct-data inputs in dataflow.  It may be that the proper way to model
 * these formally is to declare them as top-level global variables, but I need
 * to consider this more fully to be sure.  Lurking in here is the potential
 * issue of dataflow semantics as definition vs. implementation.
 * Specifically, in the context of this example, should GetCurrentCalendar as
 * a DF node take a CalendarDB type or a (persistent) CalendarDB value?  I
 * think the answer is the latter.
 *
 * Further, we would do well to consider putting data-store and user-input
 * symbols (back) into the DFD notation.  In the textual dataflow notation, a
 * declared global var can be used to denote a data-store.  To denote a user
 * input textually, we might try adapting the old Daisy II question-mark
 * notation, define (perhaps at least partially as built-ins) user-input aux
 * functions, or make the assumption that any terminal-symbol input/output is
 * from/to the user.  As an example of the last style, the Event type input to
 * DoScheduleEvent (below) would be considered a user input.  Perhaps the rule
 * can be better stated as "any input or output that is not defined in the
 * dataflow semantics as coming from or going to another op/function is
 * considered a user input/output".  In the DFD, we can highlight such
 * input/output using an explicit user input/output symbol (I recall a
 * trapezoid or some form of arrow-shaped box).
 *
 * IMPORTANT QUESTION: Can the dataflow attribute in an operation feed its own
 * inputs and outputs with a constant value, or do these have to come from
 * some context-setting operation above?  E.g., in the specific case below,
 * can DoScheduleEvent have these dataflow clauses:
 *
 *     ? -> e,                  -- User input event
 *     TheCalendarDB -> cdb,    -- Persistent caldb object
 *
 * The answer to this question might be that the df attr for an op can feed
 * its own inputs and send its own outputs from/to constant values.  Hmm, this
 * may lead us to rethink the idea that all dataflow connections have to be
 * "from above", in that it may be reasonable for any op to wire its own
 * input/output, as long as we realize that doing so makes the op less
 * general.  I.e., when an op feeds/directs its own input/output, it can be
 * used in only one specific way.  Another way to do this might be with
 * default input/output values, i.e., the "=" textual way.
 *
 * I'm not yet fully sure about this, but I'm pretty darn close.  In
 * particular, the code below contains an affirmative answer to this question,
 * using the "might be" answer ideas.
 *
 *)
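
(*
   To pin down the intended decompose/compose reading, here is a rough,
   runnable Python sketch of it.  The dataclass encoding, the lower-case
   Python names, and the "append the event" behavior of schedule_event are
   placeholder assumptions, not part of the spec; Python lists are 0-based,
   so the spec's user_calendars[1] shows up here as user_calendars[0].

   from dataclasses import dataclass, field
   from typing import List

   @dataclass
   class Event:
       name: str

   @dataclass
   class OtherStuff:
       pass

   @dataclass
   class UserCalendar:
       events: List[Event] = field(default_factory=list)

   @dataclass
   class CalendarDB:
       stuff: OtherStuff
       user_calendars: List[UserCalendar]

   def schedule_event(ucal: UserCalendar, e: Event) -> UserCalendar:
       # Child op: works purely on the decomposed UserCalendar piece.
       return UserCalendar(events=ucal.events + [e])

   def do_schedule_event(cdb: CalendarDB, e: Event) -> CalendarDB:
       # Parent op: decompose cdb into its pieces, hand the UserCalendar piece
       # to the child op, then compose the child's output with the untouched
       # OtherStuff piece to form the parent's CalendarDB output.
       ucal = cdb.user_calendars[0]            # decompose (index/selector)
       ucal_out = schedule_event(ucal, e)      # child op on the piece
       return CalendarDB(stuff=cdb.stuff,      # compose the parent output
                         user_calendars=[ucal_out] + cdb.user_calendars[1:])
*)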
(*
 * A persistent CalendarDB value.  Need to figure out exactly how or if this
 * fits into dataflow semantics.  The long-lurking thought is that it could be
 * used as the "loop-back" value from incremental ops that produce a
 * CalendarDB as output back to incremental ops that take one as an input.  It
 * may well be that such "loop-back" objects are just the persistent-data form
 * of a dataflow loop, such that both forms have the same semantics.
 *)
var TheCalendarDB:CalendarDB;

op ScheduleEvent(ucal:UserCalendar, e:Event)->ucal':UserCalendar
    description: (*
        Approach 1: Define ScheduleEvent more purely functionally by having it
        take a UserCalendar.  Use dataflow to decompose the larger context
        CalendarDB object into the necessary UserCalendar input.

        DFD
    *);
end;

op DoScheduleEvent(cdb:CalendarDB, e:Event)->cdb':CalendarDB
    components: ScheduleEvent, ... ;
    dataflow:
        TheCalendarDB -> cdb,
        ? -> e,
        cdb.user_calendars[1] -> ScheduleEvent.ucal,
        e -> ScheduleEvent.e,
        {cdb.stuff, ScheduleEvent.ucal'} -> cdb',
        cdb' -> TheCalendarDB;
    description: (*
        Approach 1a: Use selector and index ops to decompose the cdb input
        down to its UserCalendar component, which is the required input to
        ScheduleEvent.

        DFD
    *);
end;

op DoScheduleEventAltB(cdb:CalendarDB, e:Event)->cdb':CalendarDB
    components: ScheduleEvent, ... , GetCurrentCalendar, GetStuff, ComposeCalDB;
    dataflow:
        TheCalendarDB -> cdb,
        ? -> e,
        cdb -> GetCurrentCalendar.caldb,
        GetCurrentCalendar.ucal -> ScheduleEvent.ucal,
        e -> ScheduleEvent.e,
        cdb -> GetStuff.cdb,
        GetStuff.stuff -> ComposeCalDB.stuff,
        ScheduleEvent.ucal' -> ComposeCalDB.ucal,
        ComposeCalDB.cdb -> cdb',
        cdb' -> TheCalendarDB;
    description: (*
        Approach 1b: Use auxiliary dataflow functions instead of
        selector/indexer ops to get at the necessary UserCalendar.

        DFD
    *);
end;

function GetCurrentCalendar(caldb:CalendarDB) -> ucal:UserCalendar =
    caldb.user_calendars[1]
    description: (*
        Aux function to use in dataflow in lieu of built-in data access
        operators.
    *);
end;

function ComposeCalDB(stuff:OtherStuff, ucal:UserCalendar) -> cdb:CalendarDB =
    {stuff, ucal}
    description: (*
        Aux function to use in dataflow in lieu of built-in data access
        operators.
    *);
end;

function GetStuff(cdb:CalendarDB) -> stuff:OtherStuff =
    cdb.stuff
    description: (*
        Aux function to use in dataflow in lieu of built-in data access
        operators.
    *);
end;

op DoScheduleEventAltC(cdb:CalendarDB = TheCalendarDB, e:Event)->
        cdb':CalendarDB = TheCalendarDB
    components: ScheduleEvent, ... ;
    dataflow:
        GetCurrentCalendar(cdb) -> ScheduleEvent.ucal,
        e -> ScheduleEvent.e,
        ComposeCalDB(cdb.stuff, ScheduleEvent.ucal') -> cdb';
    description: (*
        Approach 1c: Use a combination of dataflow and textual function
        invocation, the idea being to have operations be dataflow nodes, and
        auxiliary functions be invoked textually from within dataflow
        connection clauses.  Also, persistent data i/o is denoted with default
        i/o parameter values, and we rely on the convention that any unhooked
        i/o is from/to the user, which in this case is the Event input.
    *);
end;

op ScheduleEvent(caldb:CalendarDB, e:Event)->cdb':CalendarDB
    description: (*
        Approach 2: Define ScheduleEvent to take the larger context CalendarDB
        directly.
    *);
end;

(*
 * IMPORTANT NOTE: We need to define the preconditions and postconditions for
 * each of the above ops.
 *)
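
(*
   Continuing the Python sketch above (and reusing its types and
   schedule_event), here is one possible reading of the Approach 1c
   conventions: the persistent TheCalendarDB is a module-level variable that
   serves as the "=" default input and is re-assigned as the "=" default
   (loop-back) output, and the unhooked Event input comes from the user.  The
   use of input() for the user-input convention and the default-to-None
   parameter trick are assumptions, not part of the spec.

   from typing import Optional

   TheCalendarDB = CalendarDB(stuff=OtherStuff(),
                              user_calendars=[UserCalendar()])

   def get_current_calendar(cdb: CalendarDB) -> UserCalendar:
       return cdb.user_calendars[0]     # spec's 1-based user_calendars[1]

   def get_stuff(cdb: CalendarDB) -> OtherStuff:
       return cdb.stuff

   def compose_cal_db(stuff: OtherStuff, ucal: UserCalendar) -> CalendarDB:
       # Mirrors ComposeCalDB's {stuff, ucal}: only the one calendar is
       # carried into the composed result.
       return CalendarDB(stuff=stuff, user_calendars=[ucal])

   def do_schedule_event_alt_c(cdb: Optional[CalendarDB] = None,
                               e: Optional[Event] = None) -> CalendarDB:
       global TheCalendarDB
       if cdb is None:
           cdb = TheCalendarDB                    # "=" default: persistent input
       if e is None:
           e = Event(name=input("Event name? "))  # unhooked input: from the user
       ucal_out = schedule_event(get_current_calendar(cdb), e)
       cdb_out = compose_cal_db(get_stuff(cdb), ucal_out)
       TheCalendarDB = cdb_out                    # "=" default: persistent output
       return cdb_out
*)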
(*
 * Stub objects
 *)
obj CalendarDB = stuff:OtherStuff and user_calendars:UserCalendar*
    description: (* Stub *);
end;

obj UserCalendar
    description: (* Stub *);
end;

obj OtherStuff
    description: (* Stub *);
end;

obj Event
    description: (* Stub *);
end;

op DoScheduleEventAlt2(cdb:CalendarDB, e:Event)->cdb':CalendarDB
    post: cdb' = {cdb.stuff,
                  [ScheduleEvent(cdb.user_calendars[1], e)]
                      + cdb.user_calendars[2:#cdb.user_calendars]};
    description: (*
        Approach 3: Constructive pre/post (or just post) version of preceding
        dataflow semantics.
    *);
end;
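
(*
   Again continuing the Python sketch above, here is one way to read the
   Approach 3 postcondition as an executable predicate relating cdb, e, and
   cdb'.  The helper name do_schedule_event_alt2_post and the final assertion
   are illustrative assumptions only.

   def do_schedule_event_alt2_post(cdb: CalendarDB, e: Event,
                                   cdb_out: CalendarDB) -> bool:
       # cdb' = {cdb.stuff, [ScheduleEvent(cdb.user_calendars[1], e)]
       #                        + cdb.user_calendars[2:#cdb.user_calendars]}
       return (cdb_out.stuff == cdb.stuff
               and cdb_out.user_calendars
                   == [schedule_event(cdb.user_calendars[0], e)]
                       + cdb.user_calendars[1:])

   # The dataflow-style do_schedule_event from the first sketch should satisfy
   # this postcondition:
   db = CalendarDB(stuff=OtherStuff(),
                   user_calendars=[UserCalendar(), UserCalendar()])
   ev = Event(name="staff meeting")
   assert do_schedule_event_alt2_post(db, ev, do_schedule_event(db, ev))
*)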