Solution discussion
The biggest concerns here were whether the diagram showed:
- a reasonable breakdown of the product into processes/smaller components
- the entities (users and external systems) that interact with the product as it runs
- the data stores used
- clear directional flows of data transferred between each of the above
- clear/distinct labels on everything above
- the accompanying short (single sentence) descriptions for each of the above
- the most common issues encountered were
- trying to embed aspects of the SDLC (standards, version control processes, etc.)
into the product DFD (the DFDs are about the makeup of the actual product, not how the
team goes about building/maintaining it)
- trying to embed ERD content into the DFD (for the final exam be sure you're clear on
the different purposes of the different kinds of diagram)
- not providing enough of a decomposition of the product into processes
- not providing one or more of the requested elements (entities, data stores, etc)
- not clearly labeling and describing the parts of the diagram (vague or unlabeled data
flows in particular)
Solution discussion
This one depends a lot on the product itself, and the mark was split more or less equally
between the descriptions of challenges and the associated recommendations. Some common
challenges/recommendations include:
- sheer volume/complexity of tests needed to ensure the product meets requirements:
- use of any supporting test tools/software (from the language, platform, etc.)
- in-house software/scripts to augment these
- suggested team processes to mitigate the test load
- testing the UI/UX elements/aspects:
- UI test tools, with recording+playback of user actions
- user action scripts, with results recorded for future evaluation
- dealing with deliberate randomness (e.g. in games):
- seeded tests (based on whatever seeding mechanism the tool/language/platform provides)
- networking (establishing/maintaining connections, bandwidth, response time, etc.):
- simulated loads
- simulated connection loss
- use of network test tools (depending on the tool/language/platform)
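As one illustration of the seeded-test idea above, here is a minimal Python sketch. The `roll_dice` helper is a hypothetical stand-in for any game logic that uses randomness; the point is that passing an explicitly seeded generator makes the "random" behaviour reproducible, so a failing test run can be replayed exactly.

```python
import random

def roll_dice(rng, n=2):
    # Hypothetical game helper: roll n six-sided dice using the supplied
    # random generator rather than the global one, so tests can control it.
    return [rng.randint(1, 6) for _ in range(n)]

def test_roll_dice_is_reproducible():
    # Two generators created with the same seed must produce identical
    # sequences, so the same "random" rolls come out both times.
    a = roll_dice(random.Random(42))
    b = roll_dice(random.Random(42))
    assert a == b
    # The rolls must still be valid die faces.
    assert all(1 <= d <= 6 for d in a)

test_roll_dice_is_reproducible()
print("ok")
```

The same pattern applies in most languages/platforms: route all randomness through an injectable generator and seed it in tests, rather than calling a global RNG directly.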