Project

NetLogCom

Enhancing Usability in Environmental Monitoring: Redesigning the NetLogCom Interface


My bachelor thesis, conducted in collaboration with SEBA Hydrometrie GmbH & Co. KG and de-build.net POWER GmbH, focused on analyzing and redesigning the NetLogCom software to improve its usability for novice users. Applying user-centered design principles, I developed a high-fidelity prototype. Usability testing with 35 experts demonstrated significant improvements in effectiveness, efficiency, and satisfaction compared to the original interface.

Summary


Responsibilities

  • Usability analysis

  • Cognitive walkthroughs

  • High-fidelity prototyping

  • Visual Basic programming

  • User-centered design

  • Design iteration

  • User testing

  • Data analysis

  • Environmental monitoring expertise


Bridging Complexity and Usability: Enhancing Environmental Monitoring Tools

Background

The increasing frequency of extreme weather events such as floods underscores the urgent need for effective environmental monitoring systems. Tools such as the NetLogCom, a high-performance data collector for parameters like water levels and flow rates, play a critical role in hydrology, meteorology, and wastewater management. However, the complexity of its interface, originally designed for technical experts, limits its accessibility to a broader audience.

This project, conducted in collaboration with SEBA Hydrometrie GmbH & Co. KG and de-build.net POWER GmbH, aimed to make NetLogCom more user-friendly by redesigning the software interface based on the usability principles outlined in ISO 9241-11. The redesign focused on improving effectiveness, efficiency, and user satisfaction, enabling novice users to operate the software confidently.

By leveraging user-centered design, cognitive walkthroughs, and usability testing, the project highlighted how thoughtful interface optimization can reduce training costs, improve productivity, and minimize support needs. This approach not only aligns with industry trends but also emphasizes the critical role of usability in addressing global challenges like climate change.

This initiative demonstrates the potential of usability-focused research to bridge the gap between complex systems and diverse user groups, empowering organizations to respond more effectively to environmental crises.

Overview window of the reworked prototype of the NetLogCom Software

Identifying Barriers: Cognitive Walkthroughs in NetLogCom Usability Analysis

Analysis

The analysis phase of the NetLogCom project employed cognitive walkthroughs, a well-established usability evaluation method, to uncover the challenges users face when interacting with the software interface. By simulating the cognitive processes of five carefully crafted personas, ranging from novice to expert users in hydrology and environmental technology, the walkthroughs identified critical pain points across nine task sequences. These tasks, covering both basic and advanced functionality, were mapped to 59 individual steps and analyzed for usability roadblocks. In total, 97 usability issues were categorized by severity, from minor cosmetic concerns to critical barriers such as mislabelled buttons, insufficient visual feedback, and confusing workflows. These findings provided actionable insights, driving targeted solutions to improve effectiveness, efficiency, and user satisfaction.
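As a minimal sketch of how such findings can be aggregated by severity, the tally below uses Python; the task names, step numbers, and severity entries are illustrative, not the study's actual data.

```python
from collections import Counter

# Hypothetical walkthrough findings as (task, step, severity) tuples.
# The severity labels mirror the study's scale from cosmetic to critical;
# the individual entries are made up for illustration.
findings = [
    ("retrieve parameters", 3, "critical"),
    ("retrieve parameters", 5, "minor"),
    ("create channel",      2, "major"),
    ("create channel",      4, "critical"),
    ("change language",     1, "cosmetic"),
]

def tally_by_severity(findings):
    """Count usability issues per severity class."""
    return Counter(severity for _task, _step, severity in findings)

counts = tally_by_severity(findings)
print(counts.most_common())
```

In the actual study, the same kind of tally was applied to all 97 issues across the 59 task steps to prioritize which barriers to fix first.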

Key design focus areas identified in the analysis: error reduction, help and support, system status, consistency, buttons, dropdowns, and a minimalistic design.


From Concept to Reality: Prototyping the NetLogCom Redesign

Prototyping

The prototyping phase of the NetLogCom project was driven by the decision to develop a completely new software version, rather than iterating on the existing application. This approach was chosen due to the extensive usability issues uncovered during the cognitive walkthroughs, making a fresh start more efficient and sustainable.

A high-fidelity prototype was created, simulating real-world interactions and closely resembling the final design and functionality of the software. The development process was guided by proven principles of human-computer interaction, including Nielsen’s heuristics, Shneiderman’s golden rules, and Norman’s user-centered design philosophy. These principles ensured the redesign addressed the identified issues, enhancing usability, consistency, and error prevention.

The prototype design began in Figma, where iterative discussions with developers refined its feasibility and user focus. The finalized designs were implemented in Visual Basic, aligning with the original software’s programming environment to streamline backend integration. Key improvements included:

  1. Menu Bar: Redesigned to include subcategories like "Configuration," "Settings," "Functions," "View," and "Help," consolidating less frequently used features into a dedicated area to improve consistency and declutter the interface.

  2. Status Panel: Positioned prominently to display login status (Online/Offline), a 30-minute countdown timer, device name, current date, and time. Profile and language settings were added with recognizable icons to improve usability and visibility of the system status.

  3. Button Panel: Enlarged and centrally aligned buttons for essential functions such as retrieving, sending, and saving parameters. New buttons like "Send Current Parameters" and "Undo" were added, with confirmation pop-ups implemented to reduce errors and provide immediate feedback.

  4. Navigation Panel: Simplified to include only essential functions such as "Overview," "Measurement Channel," "Data Collector," "Access (Serial)," "Limits," "Output," and "Language." This streamlining enhances ease of navigation and reduces user overwhelm.

  5. Measurement Channel List: Displays only active channels with color-coded indicators—green for normal operation and red for issues. Added buttons for creating new channels or deleting the last channel to improve functionality and control.

  6. Measurement Channel Settings: Organized into clearly defined tabs ("Channel," "Measurement," "Cross-section," "Statistics") with guided dropdown menus and input fields to minimize errors and improve efficiency.

  7. Info Panel: Introduced hover-based tooltips and explanatory text for interface elements, complemented by a status bar to enhance context-awareness and provide continuous system feedback.
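The measurement channel list behavior described above (showing only active channels, color-coded by status) can be sketched as follows; the channel names, data structure, and status fields are assumptions for illustration, since the real implementation is in Visual Basic.

```python
# Sketch of the redesigned measurement-channel list: only active
# channels are shown, with green indicating normal operation and
# red indicating an issue. All channel data here is illustrative.
CHANNELS = [
    {"name": "Water level", "active": True,  "ok": True},
    {"name": "Flow rate",   "active": True,  "ok": False},
    {"name": "Spare",       "active": False, "ok": True},
]

def visible_channels(channels):
    """Filter to active channels and attach a status color."""
    return [
        {"name": c["name"], "color": "green" if c["ok"] else "red"}
        for c in channels
        if c["active"]
    ]

print(visible_channels(CHANNELS))
```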

Prototype of NetLogCom Software showcasing the different panels.

Usability Testing: Validating the NetLogCom Prototype

Usability Tests

The usability testing phase of the NetLogCom project was essential to evaluate whether the newly developed software prototype outperformed the original version in terms of usability. With 35 experts from the hydrology and environmental engineering fields participating, the study focused on assessing the software’s effectiveness, efficiency, and user satisfaction. Participants brought diverse professional backgrounds and varying levels of experience with the NetLogCom system, ensuring comprehensive feedback.

A within-subjects design was used, where each participant tested both the original and prototype versions of the software. This approach minimized individual variability and provided robust comparative data. Tasks were carefully designed to reflect real-world use cases, including retrieving active parameters, configuring new measurement channels, managing data collector channels, and modifying language settings. Randomizing the order of software versions helped mitigate potential learning effects.
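The order randomization above can be sketched with a simple counterbalancing scheme; the alternating assignment and participant IDs are assumptions, since the thesis only states that version order was randomized.

```python
# Counterbalanced within-subjects assignment: alternate which software
# version each participant sees first, so both orders are roughly
# equally represented across the 35 participants. The alternating
# scheme is an illustrative assumption.
VERSIONS = ("original", "prototype")

def assign_order(participant_id):
    """Return the presentation order for one participant."""
    if participant_id % 2 == 0:
        return VERSIONS                   # original first
    return tuple(reversed(VERSIONS))      # prototype first

orders = [assign_order(p) for p in range(35)]
first_original = sum(o[0] == "original" for o in orders)
print(first_original, 35 - first_original)  # 18 vs. 17 participants
```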

The usability tests measured three key dimensions. Effectiveness was evaluated by task success rates, critical error counts, and the need for assistance during tasks. Efficiency was assessed through task completion times and the participants’ perceived workload, using the NASA-TLX mental workload scale. Lastly, user satisfaction was measured with the ISONORM 9241/110-S questionnaire, which explored areas like ease of use, learnability, and adaptability.
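For the workload measure, NASA-TLX yields an overall score from six subscale ratings; the sketch below computes the unweighted "Raw TLX" mean. Whether the study used the raw or the weighted variant is not stated here, and the example ratings are illustrative.

```python
# Raw TLX scoring: the unweighted mean of the six NASA-TLX subscales,
# each rated 0-100. The example ratings below are illustrative, not
# participant data from the study.
SUBSCALES = ("mental", "physical", "temporal",
             "performance", "effort", "frustration")

def raw_tlx(ratings):
    """Overall perceived workload as the mean of the six subscales."""
    missing = set(SUBSCALES) - ratings.keys()
    if missing:
        raise ValueError(f"missing subscales: {sorted(missing)}")
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

example = {"mental": 70, "physical": 20, "temporal": 55,
           "performance": 40, "effort": 60, "frustration": 65}
print(round(raw_tlx(example), 1))  # 51.7
```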


Evaluating the NetLogCom Prototype

Results

The usability testing of the NetLogCom prototype yielded detailed insights into its effectiveness, efficiency, and user satisfaction compared to the original software version. By combining quantitative metrics with qualitative feedback, the results highlighted significant improvements introduced by the prototype, as well as areas for further refinement.

Quantitative Findings

The quantitative analysis revealed statistically significant enhancements in the prototype's usability. Effectiveness was evident in higher task success rates for complex configurations, such as channel creation. The prototype also recorded a notable reduction in critical errors and in requests for assistance. Efficiency improvements were demonstrated through reduced task completion times, particularly for complex workflows. Participants experienced significantly lower mental workload, as measured by the NASA-TLX, indicating reduced frustration and cognitive strain. The ISONORM 9241/110-S questionnaire further confirmed superior satisfaction levels, with participants consistently rating the prototype higher across all usability dimensions, including task alignment, error tolerance, and intuitive navigation.
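For a within-subjects comparison like this, differences are typically tested per participant; the sketch below computes a paired t statistic from matched samples. The specific statistical test used in the thesis is not stated here, and the timing data is illustrative.

```python
import math
from statistics import mean, stdev

def paired_t(before, after):
    """Paired t statistic and degrees of freedom for matched samples.

    A positive t means 'before' (original) scored higher on average
    than 'after' (prototype).
    """
    if len(before) != len(after):
        raise ValueError("samples must be paired")
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n)), n - 1

# Illustrative task-completion times in seconds (not the study's data).
original  = [120, 95, 140, 110, 130, 105]
prototype = [ 90, 80, 100,  85, 110,  95]
t, df = paired_t(original, prototype)
print(round(t, 2), df)  # 5.29 5
```

Pairing each participant's two measurements is what makes the within-subjects design statistically efficient: individual speed differences cancel out in the per-participant differences.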

Qualitative Findings

The qualitative data provided deeper insights into user experiences. While the original software drew criticism for unclear navigation, unintuitive interactions, and a lack of feedback, the prototype addressed many of these issues. Participants appreciated the clear, consistent layout and new features like dropdown menus and real-time status updates. One participant noted, "The dropdown options make configuration so much simpler," reflecting the effectiveness of the updated interaction design.

However, areas for improvement were also identified. Some users found element labels confusing, such as the operational mode names, and desired more intuitive time-editing functions. Feedback suggested enhancing the visibility and accessibility of the info panel, as nearly half of the participants overlooked it despite its usefulness. Recommendations included larger text, color coding, or pop-up notifications to increase its prominence.