From 131178723727f44ae531610ab5cb42910bce35bc Mon Sep 17 00:00:00 2001 From: Ben Forge <74168521+BenCheung0422@users.noreply.github.com> Date: Sat, 31 May 2025 05:44:56 +0800 Subject: [PATCH 1/6] EFP: Add EFP 8 --- efp/efp008/main.xml | 71 +++++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 71 insertions(+) create mode 100644 efp/efp008/main.xml diff --git a/efp/efp008/main.xml b/efp/efp008/main.xml new file mode 100644 index 0000000..1d23843 --- /dev/null +++ b/efp/efp008/main.xml @@ -0,0 +1,71 @@ + + + + + + + + +
+ +

+ The backbone of the MUI Framework would be supported by the Ferricia Engine, together
+ with Simple DirectMedia Layer (SDL) 3, OpenAL and OpenGL. Because of this, the interactions
+ between the Engine library and the Kotlin interface are essential and must be sufficiently
+ efficient and effective. In this framework, the Graphical Component System (GCS)
+ would also be introduced as the fundamental graphical system for the Engine's use.
+ This Framework is applicable only to client targets, not dedicated server targets.

+
+
+
+ +

+ The MUI Framework consists of window management, rendering, audio processing, input
+ management, user feedback management and interactive simulations. Rendering would
+ mostly be handled using OpenGL with a single canvas on a single window;
+ audio processing and simulations would mostly be handled using OpenAL, with sources
+ transmitted via networking or resource management; other elements would mostly be
+ handled using other SDL subsystems.

+

+ macOS requires that all window-related tasks, including windowing, rendering and
+ input events, be handled on the main thread. Therefore, the main thread would
+ do all windowing, rendering and primary window event handling, with other tasks
+ delegated to other threads. This should ensure the application can work on every
+ target platform while allocating the main thread only to essential tasks.
+ Note that this does not include any audio-related task.

+

+ Primary window events include all raw pointer and key inputs, as well as all window
+ events sent by SDL. Those events would then be sent to the GCS Canvas for event handling
+ and processing in the GCS ticking thread. It is important to note that the objects and
+ states in the GCS must be handled carefully so that they can be rendered
+ safely and asynchronously on the main thread.

+
+
+
+ +

+ The GCS would only be implemented in the Kotlin interface with abstractions.
+ Rendering of Graphical Component Objects (GCOs) would be backed by the Engine's
+ rendering system, using the low-level Kotlin bindings. All GCOs serve only as
+ graphical wrappers of the underlying objects they manage, such as data entries, states,
+ controls and toggles. However, audio feedback triggered by GCOs cannot be managed
+ by the GCOs themselves, but by a separate audio user interface manager.

+
+
+
+ + +
  • EFP 3
  • +
  • EFP 5
  • +
    +
    +
    + +
From 9b47bb4f77530fab007dcf15d9d0f415826b3e67 Mon Sep 17 00:00:00 2001 From: Ben Forge <74168521+BenCheung0422@users.noreply.github.com> Date: Sun, 1 Jun 2025 08:56:15 +0800 Subject: [PATCH 2/6] EFP 8: Update information about Events and GMS --- efp/efp008/main.xml | 170 ++++++++++++++++++++++++++++++++++++++++---- 1 file changed, 157 insertions(+), 13 deletions(-) diff --git a/efp/efp008/main.xml b/efp/efp008/main.xml index 1d23843..95e2e0d 100644 --- a/efp/efp008/main.xml +++ b/efp/efp008/main.xml @@ -14,7 +14,7 @@
The backbone of the MUI Framework would be supported by the Ferricia Engine, together
with Simple DirectMedia Layer (SDL) 3, OpenAL and OpenGL. Because of this, the interactions
between the Engine library and the Kotlin interface are essential and must be sufficiently
- efficient and effective. In this framework, the Graphical Component System (GCS)
+ efficient and effective. In this framework, the Graphical Model System (GMS)
would also be introduced as the fundamental graphical system for the Engine's use.
This Framework is applicable only to client targets, not dedicated server targets.

    @@ -33,29 +33,173 @@

macOS requires that all window-related tasks, including windowing, rendering and
input events, be handled on the main thread. Therefore, the main thread would
- do all windowing, rendering and primary window event handling, with other tasks
+ do all windowing, rendering and primary raw event handling, with other tasks
delegated to other threads. This should ensure the application can work on every
target platform while allocating the main thread only to essential tasks.
Note that this does not include any audio-related task.

- Primary window events include all raw pointer and key inputs, as well as all window
- events sent by SDL. Those events would then be sent to the GCS Canvas for event handling
- and processing in the GCS ticking thread. It is important to note that the objects and
- states in the GCS must be handled carefully so that they can be rendered
+ Primary raw events include all raw pointer and key inputs, as well as all window
+ events sent by SDL. Most events would then be sent to the GMS Canvas for event handling
+ and processing in the GMS ticking thread. It is important to note that the objects and
+ states in the GMS must be handled carefully so that they can be rendered
safely and asynchronously on the main thread.

    +
    + +

+ All the primary raw events are sent by, or as part of, the SDL functionalities.
+ It is important to note that several event types would never be
+ used in the Engine.
+ Custom user events would not be handled in SDL but mostly in the Kotlin interface,
+ so events like SDL_UserEvent and SDL_QuitEvent are not used.
+ Since touchscreens, cameras and pens are not target devices to support,
+ their corresponding events are also not used.

    +

+ Since mobile platforms are not targeted, events primarily designed for mobile platforms
+ are also ignored. However, a mechanism to notify the user through an application warning
+ notification that the system or the heap is running low on memory is possible.
+ Still, it is expected that the application is always optimized to a degree that
+ it can run and proceed properly without over-optimizing any important resources.
+ When the application is low on memory, the user is responsible for managing memory size
+ and reducing memory demands as much as possible through the customization configurations;
+ otherwise, there is nothing the application could do to free up memory.

    +

+ The application window could optionally reduce rendering frequency when the window
+ is not active. This includes the conditions where the window is hidden, minimized
+ or occluded. The maximum number of frames per second (FPS) could be limited to 30 or
+ half of the display refresh rate. The window cannot render properly
+ and should stop rendering whenever the window bounds fall outside a single display area;
+ spanning the window across multiple displays is currently unsupported behavior, for
+ the sake of behavioral correctness and validity.
+ This means functions such as SDL_GetDisplayForWindow are not supported.
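The throttling rule above leaves room for interpretation; one reading (taking the stricter of the two caps) can be sketched in Kotlin. The function and parameter names are assumptions for illustration, not Engine or SDL API:

```kotlin
// One possible reading of the inactive-window throttle described above:
// cap at the stricter of 30 FPS and half the display refresh rate.
// `displayRefreshHz` and `windowActive` would come from window/display queries.
fun targetFps(displayRefreshHz: Int, windowActive: Boolean): Int =
    if (windowActive) displayRefreshHz
    else minOf(30, displayRefreshHz / 2)
```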

    +

+ The application does not rely on the system theme but on its own textures and themes,
+ so the system theme change event is not used; the application might use the system
+ locale settings to identify which language to use when the game data directory is
+ first initialized, but the value would not change afterward, so the
+ system locale change event is also not used.

    +

+ Display events may be used only when the application is in fullscreen, and only
+ for the events related to the display the window is rendered on.
+ Otherwise, there is no need to take care of display information.
+ However, when the window is moved or resized, the display and window states must
+ be checked to determine whether the window can still be rendered.
+ It is important to note that the rendering controls are still handled in
+ the Kotlin interface, so the events must be forwarded to the Kotlin side.

    +

+ When the window gains focus, it is assumed that all input device states
+ are "zero", meaning that all states are cleared or reset to origin.
+ However, this may not always be true, so when the events resetting the states
+ are fired but the states were already "zero", the states remain unchanged,
+ but warnings (recovered situations) are emitted.
+ It is important to note that the set of states in SDL is not used for this reason.
+ Also, when the window loses focus, all the input device states in the application
+ should be reset to origin and never updated until the window regains focus.
+ When the cursor is outside the canvas's boundaries, the cursor is not regarded as
+ focused on the canvas, and mouse events are not processed until the cursor
+ enters the canvas's region again; the cursor is treated as nonexistent in the application
+ and hidden in the canvas while it is outside the canvas's region.
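The focus-driven reset policy above can be sketched as a small state store. All names here are hypothetical; the real store would be fed by the primary raw events:

```kotlin
// Sketch of the policy above: states reset to "zero" on focus loss, updates are
// ignored while unfocused, and a redundant reset (states already "zero") leaves
// the states unchanged but emits a recovered-situation warning.
class InputStateStore {
    private val pressed = mutableSetOf<Int>()   // scancodes currently held
    private var focused = true
    val warnings = mutableListOf<String>()

    fun keyDown(scancode: Int) {
        if (focused) pressed += scancode        // updates ignored while unfocused
    }

    fun onFocusLost() { focused = false; resetStates() }

    fun onFocusGained() { focused = true; resetStates() }

    private fun resetStates() {
        if (pressed.isEmpty()) warnings += "reset requested but states already zero"
        else pressed.clear()
    }

    fun isPressed(scancode: Int) = scancode in pressed
}
```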

    +

+ Event timestamps are ignored.
+ When events are polled from the SDL event queue, they are expected to be sorted
+ in chronological order, so the timestamps should not affect
+ the synchronization states while they are pumped and polled per frame update.

    +

+ The modifier key states in SDL are generally not used.
+ For text typing, this should normally already be handled by the input editor,
+ which is processed in other events such as text editing or text input.
+ This includes key repeating, which is likely fired by the device or system.
+ The application itself internally holds a set of key states updated by
+ all the primary raw events, rather than the keyboard states queried directly.
+ Also, the game is not playable with an external on-screen keyboard tool.

    +

+ Scancodes of the keys would always be used internally, while the keycodes
+ may be used to display the hotkey settings in the application.
+ Therefore, the keycodes in the events are ignored, and displaying
+ the keys to users would be done differently.
+ Since scancodes do not change with keyboard layouts, the users have to
+ identify which layouts their keyboards use.
+ This cannot simply be done with SDL functions, as some keys may be indistinguishable
+ (for keycodes) or not mapped against layouts between different keyboards (for scancodes).
+ A custom-built set of layout key mappings could be used to correctly map the keys
+ by the user's choice, which is QWERTY by default.
+ It is still a question whether SDL_GetKeyFromScancode could be used
+ safely across different platforms and various devices.
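The custom layout mapping above could look like the following sketch: scancodes stay the internal identifiers, and a user-selected layout table (QWERTY by default) is consulted only for display. The scancode values follow the USB/SDL convention (4 is the "A" key position), but the tables are tiny illustrations, not a real mapping:

```kotlin
// Hypothetical layout-aware display mapping; not an Engine or SDL API.
enum class KeyboardLayout { QWERTY, AZERTY }

val displayTables: Map<KeyboardLayout, Map<Int, String>> = mapOf(
    KeyboardLayout.QWERTY to mapOf(4 to "A", 20 to "Q", 26 to "W"),
    KeyboardLayout.AZERTY to mapOf(4 to "Q", 20 to "A", 26 to "Z"),
)

fun keyDisplayName(scancode: Int, layout: KeyboardLayout = KeyboardLayout.QWERTY): String =
    displayTables[layout]?.get(scancode) ?: "SC_$scancode"   // fallback label
```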

    +

+ There would be two modes for mouse or cursor inputs.
+ By default, system cursor states are tracked and copied to the cursor states,
+ which means the cursor motion states are updated each tick to correspond to
+ the cursor position retrieved from SDL.
+ During gameplay, the cursor mode switches to relative and the cursor is wrapped
+ in the window, most likely only at the center of the canvas.
+ In this mode, the mouse motion events from SDL are used instead of tracking
+ the system cursor position.
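The two cursor modes above can be sketched as follows (all names are assumptions): absolute mode copies the system cursor position each tick, while relative mode accumulates SDL-style motion deltas and ignores the tick position:

```kotlin
// Sketch of the two cursor input modes described above.
sealed interface CursorMode {
    object Absolute : CursorMode   // follow the system cursor position each tick
    object Relative : CursorMode   // accumulate motion deltas; cursor wrapped at center
}

class CursorState {
    var x = 0; var y = 0
    var mode: CursorMode = CursorMode.Absolute

    fun onTick(systemX: Int, systemY: Int) {
        if (mode == CursorMode.Absolute) { x = systemX; y = systemY }
    }

    fun onMotionDelta(dx: Int, dy: Int) {
        if (mode == CursorMode.Relative) { x += dx; y += dy }
    }
}
```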

    +

+ It is possible to select keyboard and mouse devices so that input events are
+ accepted exclusively from just one device instead of all.
+ However, by default, all keyboard and mouse devices would be accepted to allow
+ further customization on top of this.
+ There is a possible concern that the application may become unusable when
+ the selected devices become unusable even though other devices could still
+ control the system, so this feature might be reserved.

    +

+ Joystick and gamepad update completion events (SDL_EVENT_JOYSTICK_UPDATE_COMPLETE
+ and SDL_EVENT_GAMEPAD_UPDATE_COMPLETE) are generally not used,
+ since the status updates are done by polling all the events in a given
+ frame.
+ If any issue is encountered, such as synchronization problems
+ or delayed events, this should be reconsidered.

    +
    +
    -
    +

- The GCS would only be implemented in the Kotlin interface with abstractions.
- Rendering of Graphical Component Objects (GCOs) would be backed by the Engine's
- rendering system, using the low-level Kotlin bindings. All GCOs serve only as
- graphical wrappers of the underlying objects they manage, such as data entries, states,
- controls and toggles. However, audio feedback triggered by GCOs cannot be managed
- by the GCOs themselves, but by a separate audio user interface manager.
+ The GMS would only be implemented in the Kotlin interface with abstractions.
+ Under the GMS, all Graphical Model Objects (GMOs) serve as graphical wrappers of
+ the underlying objects they manage, such as data, states and even other GMOs.
+ The hierarchy of GMOs includes Screens, Menus and Components,
+ where each includes the next directly without skipping any stratum.
+ Their rendering would be backed by the Engine's rendering system,
+ using the low-level Kotlin bindings.
+ However, audio feedback triggered by GMOs cannot be managed by the GMOs themselves,
+ but by an audio interface manager outside the GMS.

    +

+ The ScreenManager manages Screens stored in a linked list, with the one
+ on top covering the others underneath.
+ The Screen on top also blocks user interactions via the Graphical User Interface (GUI)
+ from reaching the Screens underneath.
+ In the rendering system, the RenderSystem renders the Screens managed
+ by the ScreenManager onto the Canvas backed by the Engine.
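A minimal sketch of the ScreenManager idea above (class and method names are assumptions): a stack of Screens where only the top one receives GUI interactions, while the renderer would draw all of them bottom-to-top:

```kotlin
// Illustrative screen stack; the real ScreenManager lives in the Kotlin interface.
class Screen(val name: String)

class ScreenManager {
    private val stack = ArrayDeque<Screen>()        // first element = top

    fun push(screen: Screen) = stack.addFirst(screen)
    fun pop(): Screen? = stack.removeFirstOrNull()

    /** Only the top Screen may receive GUI interactions. */
    fun interactionTarget(): Screen? = stack.firstOrNull()

    /** Bottom-to-top order for the rendering system. */
    fun renderOrder(): List<Screen> = stack.asReversed()
}
```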

    +

+ Rendering of content is not affected by the system's display scale; the application itself
+ would include a GUI scale setting to scale the sizes of Menus and Components.
+ However, this affects only the dimensions, not all the margins.
+ In the implementation, margins would most likely use percentages or ratios.
+ Still, the implementation details of GUI scaling are subject to confirmation.
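Since the details are still to be confirmed, the scaling rule above can only be sketched under assumptions: component dimensions are multiplied by the user-chosen scale, while margins are derived from ratios of the canvas and are therefore unaffected by it:

```kotlin
// Hypothetical sketch of the GUI-scale rule; names and values are assumptions.
data class Dimensions(val w: Int, val h: Int)

fun scaledDimensions(base: Dimensions, guiScale: Int): Dimensions =
    Dimensions(base.w * guiScale, base.h * guiScale)

/** A margin expressed as a ratio of the canvas extent, independent of guiScale. */
fun marginPx(canvasExtent: Int, ratio: Double): Int = (canvasExtent * ratio).toInt()
```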

    +

+ It is complex to query the window safe area for rendering when it varies from device to device.
+ Therefore, it is assumed that the safe area would generally be the canvas area
+ with paddings of 10 pixels, to make sure the rendering behavior is consistent
+ across platforms.
+ This applies only to important interactable or rendered content; general displayed content
+ like the background, miscellaneous margins, paddings and borders is not affected by this.
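The fixed safe-area rule above amounts to insetting the canvas rectangle by 10 pixels on every side. `Rect` here is an illustrative type, not an Engine API:

```kotlin
// Sketch of the assumed safe area: canvas minus a 10 px padding per side.
data class Rect(val x: Int, val y: Int, val w: Int, val h: Int)

const val SAFE_PADDING = 10

fun safeArea(canvas: Rect): Rect = Rect(
    canvas.x + SAFE_PADDING,
    canvas.y + SAFE_PADDING,
    (canvas.w - 2 * SAFE_PADDING).coerceAtLeast(0),  // clamp degenerate canvases
    (canvas.h - 2 * SAFE_PADDING).coerceAtLeast(0),
)
```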

    From 4a2852825c1db7cb0e0844c0fd1ea37a682091c1 Mon Sep 17 00:00:00 2001 From: Ben Forge <74168521+BenCheung0422@users.noreply.github.com> Date: Sat, 7 Jun 2025 04:01:01 +0800 Subject: [PATCH 3/6] EFP 8: Add info about Rendering Framework --- efp/efp008/main.xml | 85 +++++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 85 insertions(+) diff --git a/efp/efp008/main.xml b/efp/efp008/main.xml index 95e2e0d..259681f 100644 --- a/efp/efp008/main.xml +++ b/efp/efp008/main.xml @@ -203,11 +203,96 @@

    +
    + +

+ The entire rendering framework is backed by OpenGL, so the details are mostly based on
+ the features and functionalities provided by OpenGL.
+ Basically, the framework aligns with the rendering pipeline of OpenGL and general
+ rendering procedures, such as shaders and textures.

    +

    + On GUI, the camera is fixed, so the view is also fixed and constant. + Therefore, in shaders, the view matrix and transformation are omitted and disregarded. + The projection should always be orthogonal and thus being a uniform value, while + the model transformation would be done per instance as an attribute, but not per vertex. +
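With the camera fixed as described above, the projection can be built once on the CPU and uploaded as a uniform, with no view matrix at all. The sketch below uses column-major layout as OpenGL expects; fixing near/far at -1/1 for a 2D GUI is an assumption:

```kotlin
// Orthographic projection for a fixed 2D camera (column-major, near = -1, far = 1).
fun ortho2d(left: Float, right: Float, bottom: Float, top: Float): FloatArray = floatArrayOf(
    2f / (right - left), 0f, 0f, 0f,
    0f, 2f / (top - bottom), 0f, 0f,
    0f, 0f, -1f, 0f,
    -(right + left) / (right - left), -(top + bottom) / (top - bottom), 0f, 1f,
)
```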

    +

+ There would be several sets of texture atlases implemented in the Engine.
+ Texture atlases should be used to enhance performance, since a large number of
+ texture units would mostly be used in the game; otherwise, frequent texture bindings,
+ which are likely expensive operations, may occur.
+ In general, one might use
+ half pixel correction
+ to solve the issue, but the outcome may not be ideal or fit perfectly with
+ the rendering results due to the clipped outermost pixels in a tile as
+ a texture atlas element (later referred to as a "texel").
+ Therefore, it is recommended to have paddings or border pixels for each texel
+ in an atlas to keep the outermost pixels rendered well.
+ Even using "nearest neighbor" filters without paddings may not solve the issue
+ completely because of potential numeric rounding errors.
+ All GUI textures would be managed with one atlas without any mipmapping, while all tile,
+ item and entity textures would be managed with different atlases with mipmapping.

    +

+ Different textures have different constraints, resulting in different configurations.
+ Since GUI textures are not supposed to be scaled down, but only scaled up depending on
+ the scaling setting with the base layout set at the lowest reference scaling,
+ mipmapping is not suitable for them.
+ On the other hand, the gameplay region may be scaled up or down, so mipmapping may be
+ applied to enhance the performance of rendering diminished textures.
+ Regardless of whether the atlas is mipmapped, padding would be done by repeating
+ the outermost pixels to the desired extent for each texel.
+ However, when the atlas is mipmapped, the padding size in pixels would be two to
+ the power of the maximum mipmap level of the atlas, so paddings still take effect
+ for each mipmap.
+ When the atlas reaches the hardware-limited size (known from GL_MAX_TEXTURE_SIZE),
+ a new atlas with the same configuration should be used, but stored together in a managed set.
+ Also, manual mipmapping for each texel is required to avoid edge bleeding during mipmapping;
+ computations of texture atlases would be performed and designed with reference to
+ the rectangle packing algorithm by David Colson
+ or crates like rectangle-pack and crunch-rs; unless
+ the texels are explicitly constrained to powers of two only, this kind of
+ advanced algorithm would still be used.
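The padding rule above reduces to a small formula: with mipmapping, each texel gets a border of 2^maxMipLevel repeated edge pixels, so that even the smallest mip level still keeps at least one padding pixel per side. The function name is an assumption:

```kotlin
// Border width per texel side for a mipmapped atlas: 2^maxMipLevel pixels.
// Level 0 (no mip reduction) still yields a 1 px border of repeated edge pixels.
fun paddingPx(maxMipLevel: Int): Int = 1 shl maxMipLevel
```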

    +

+ In general, since the art style tends to be pixelated, "nearest neighbor" would mostly
+ be used to magnify the texels, but blending would be applied when minifying to
+ simulate the similar colors.
+ At the least, the mipmaps would be weighted using linear interpolation, but color filtering
+ could be either of the available options through settings or launch options, trading off
+ rendering quality and computational complexity.
+ The maximum mipmapping level would be determined by the greatest power of two among
+ the common factors of all texel dimensions in the specific atlas, which could be
+ further limited through settings or launch options.
+ Basically, there would still be an enforced cap on the maximum value, such as four;
+ the dimensions of tile texels and entity texels must be a multiple of 8 and 2
+ respectively.
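The mip-level rule above can be computed from trailing zero bits: 2^k divides n exactly when n has at least k trailing zero bits, so the largest power of two dividing every texel dimension is the minimum of those counts, clamped to the enforced cap (four here, as the text suggests). The function name is an assumption:

```kotlin
// Largest mip level such that 2^level divides every texel dimension, capped.
fun maxMipLevel(texelDims: List<Int>, cap: Int = 4): Int =
    minOf(texelDims.minOf { Integer.numberOfTrailingZeros(it) }, cap)
```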

    +

+ Besides, the application would first enter the early loading stage to render progress
+ by initializing the minimal components required to show the progress bar and the loading screen.
+ Thus, the logo used in the loading screen would be embedded in the application, and other
+ components would start loading and be tracked after the initialization of the screen.
+ The loading screen may also show during loading of updated resource pack configurations.
+ Moreover, a short animated splashscreen could show before the loading screen.

    +
    +
  • EFP 3
  • EFP 5
  • +
  • + Stack Exchange Game Development: "How to avoid texture bleeding in a texture atlas?" + answered by danijar +
  • +
  • + Stack Exchange Game Development: "How can I prevent seams from showing up on objects + using lower mipmap levels?" answered by Panda Pajama +
  • From fa2b1c18696837429343b7dfaa567d484704504b Mon Sep 17 00:00:00 2001 From: Ben Forge <74168521+BenCheung0422@users.noreply.github.com> Date: Sun, 8 Jun 2025 06:51:12 +0800 Subject: [PATCH 4/6] EFP 8: Add info about other rendering details --- efp/efp008/main.xml | 57 ++++++++++++++++++++++++++++++++++++++++++++- 1 file changed, 56 insertions(+), 1 deletion(-) diff --git a/efp/efp008/main.xml b/efp/efp008/main.xml index 259681f..2623bb2 100644 --- a/efp/efp008/main.xml +++ b/efp/efp008/main.xml @@ -201,6 +201,15 @@ This only applied to important interactable or rendered content, general displayed content like the background, miscellaneous margins, paddings and borders are not affected by this.

    +

+ Most graphical elements should be based on textures and the default shaders used with textures.
+ Only if certain geometric shapes are really required, such as dynamic or variable lines and curves
+ used to show interactions and associate interactive elements on the GUI, may the restriction
+ to use only textures be bypassed.
+ However, it is still recommended to expose customizable properties like colors via resources.
+ In this case, the default shaders which take textures would not be used, but rather shaders
+ that take other attributes.

    @@ -215,7 +224,8 @@ On GUI, the camera is fixed, so the view is also fixed and constant. Therefore, in shaders, the view matrix and transformation are omitted and disregarded. The projection should always be orthogonal and thus being a uniform value, while - the model transformation would be done per instance as an attribute, but not per vertex. + the model transformation and color filter would be done per instance as uniform values + evaluated each frame depending on the specifics, but not per vertex.

    There would be several sets of texture atlases implemented in the Engine. @@ -278,6 +288,51 @@ The loading screen may also show during loading of updated resource pack configurations. Moreover, a short animated splashscreen could show before the loading screen.

    +

+ Any GMO would internally hold an opaque handle managed by the Engine.
+ When a Screen is pushed onto the active stack, the handle in the Engine would be passed to
+ the internally held state, RenderingState, to be used during rendering.
+ The state internally holds a Mutex lock which restricts access to
+ only one thread at a time for thread safety.
+ Ideally, only the GMS ticking thread and the main thread, which performs rendering,
+ would access the state, either for modifications or for reading during rendering.
+ The state impacts the modification thread only while the GMOs are within the state,
+ which the state would internally mark during transitions.
+ This pipeline could greatly reduce JNI calls during rendering to improve performance,
+ with, as a general idea, only one call per loop.
+ Note that synchronization must be carefully examined for any data accessed in the Engine.
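The handoff above can be sketched with a JVM lock standing in for the Engine-side mutex. The handle and method names are assumptions; the real state would wrap an opaque Engine handle used over JNI:

```kotlin
import java.util.concurrent.locks.ReentrantLock
import kotlin.concurrent.withLock

// Illustrative RenderingState: one lock shared by the GMS ticking thread
// (modifications) and the main thread (reading for rendering).
class RenderingState(private val engineHandle: Long) {
    private val lock = ReentrantLock()

    /** GMS ticking thread: mutate the wrapped GMO data under the lock. */
    fun <T> modify(block: () -> T): T = lock.withLock(block)

    /** Main thread: read the state for rendering, ideally one JNI call per loop. */
    fun <T> renderWith(block: (Long) -> T): T = lock.withLock { block(engineHandle) }
}
```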

    +

+ In general, rendering of any GMO should avoid reading framebuffer data drawn by
+ any previous GMO in the rendering loop.
+ This includes applying visual effects like color inversion or copying;
+ this is due to the parallelization of shader computations on render devices.
+ (See the related question.)
+ Various optimizations are also conducted, like reusing the same shader program as much
+ as possible when the previous draw call was already using it, eliminating
+ unnecessary OpenGL commands.

    + +
    +
    + +

+ As a general idea, the minimum supported version is OpenGL 2.0, with some extensions required.
+ Although, at the time of writing, most consumer devices from the last decade should
+ support OpenGL 3.0 and above, 2.0 is still targeted as the minimum version for most other
+ potential users.
+ However, when a newer version or usable extension is detected in the OpenGL context,
+ advanced features that may improve performance or enable other effects might be available.

    +

+ According to Wikipedia, the GLSL version in OpenGL 2.0 is 1.10, so version 110 would be used
+ as the base GLSL version for all the shaders used in the Engine.
+ If a more advanced GLSL version is supported while advanced features are used, the target
+ GLSL version should be specified and the shaders applied only when the target version
+ is matched for both the GL and GLSL versions.
+ That said, some shaders might be used or available only when they are
+ provided by the OpenGL context.
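The version-matching rule above can be sketched as picking the most advanced shader variant whose declared GL and GLSL targets are both met by the context. The integer encoding (GL 2.0 as 20, GLSL 1.10 as 110) is an assumption for illustration:

```kotlin
// Hypothetical shader variant selection: base variant targets GL 2.0 / GLSL 110,
// richer variants apply only when the context reports high enough versions.
data class ShaderVariant(val minGl: Int, val minGlsl: Int, val name: String)

fun pickVariant(glVersion: Int, glslVersion: Int, variants: List<ShaderVariant>): ShaderVariant =
    variants
        .filter { glVersion >= it.minGl && glslVersion >= it.minGlsl }
        .maxByOrNull { it.minGlsl }
        ?: error("no shader variant usable on GL $glVersion / GLSL $glslVersion")
```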

    From f013ef4ab8b11290727a987da46e3690c48693ba Mon Sep 17 00:00:00 2001 From: Ben Forge <74168521+BenCheung0422@users.noreply.github.com> Date: Thu, 12 Feb 2026 23:39:34 +0800 Subject: [PATCH 5/6] EFP 8: Rewrite MUI EFPs Software Multimodal User Interface (MUI) Framework --- efp/efp008/main.xml | 357 +++++++--------------------------------- efp/efp008/overview.svg | 57 +++++++ 2 files changed, 115 insertions(+), 299 deletions(-) create mode 100644 efp/efp008/overview.svg diff --git a/efp/efp008/main.xml b/efp/efp008/main.xml index 2623bb2..926f84b 100644 --- a/efp/efp008/main.xml +++ b/efp/efp008/main.xml @@ -1,7 +1,6 @@ + efp="8" created="2025-05-30" category="standard" status="draft" title="Software Multimodal User Interface (MUI) Framework"> @@ -10,346 +9,106 @@
    +

- The backbone of the MUI Framework would be supported by the Ferricia Engine, together
- with Simple DirectMedia Layer (SDL) 3, OpenAL and OpenGL. Because of this, the interactions
- between the Engine library and the Kotlin interface are essential and must be sufficiently
- efficient and effective. In this framework, the Graphical Model System (GMS)
- would also be introduced as the fundamental graphical system for the Engine's use.
- This Framework is applicable only to client targets, not dedicated server targets.
+ Generally speaking, a software application or computer may contain User Interfaces (UIs),
+ such as the Graphical User Interface (GUI), Acoustic User Interface (AUI),
+ Haptic User Interface (HUI) and Kinesthetic User Interface (KUI).
+ All of them require Human Interface Devices (HIDs)
+ as the media of interaction between the user and the software.

- The MUI Framework consists of window management, rendering, audio processing, input
- management, user feedback management and interactive simulations. Rendering would
- mostly be handled using OpenGL with a single canvas on a single window;
- audio processing and simulations would mostly be handled using OpenAL, with sources
- transmitted via networking or resource management; other elements would mostly be
- handled using other SDL subsystems.
+ Each UI plays a role in the MUI Framework according to its responsibilities.
+ Together they provide the complete UI framework with which users interact.
+ Different UIs may interoperate with different counterparts to integrate
+ their roles and achieve the various features users use.

    -

- macOS requires that all window-related tasks, including windowing, rendering and
- input events, be handled on the main thread. Therefore, the main thread would
- do all windowing, rendering and primary raw event handling, with other tasks
- delegated to other threads. This should ensure the application can work on every
- target platform while allocating the main thread only to essential tasks.
- Note that this does not include any audio-related task.

    -

- Primary raw events include all raw pointer and key inputs, as well as all window
- events sent by SDL. Most events would then be sent to the GMS Canvas for event handling
- and processing in the GMS ticking thread. It is important to note that the objects and
- states in the GMS must be handled carefully so that they can be rendered
- safely and asynchronously on the main thread.

    -
    +

- All the primary raw events are sent by, or as part of, the SDL functionalities.
- It is important to note that several event types would never be
- used in the Engine.
- Custom user events would not be handled in SDL but mostly in the Kotlin interface,
- so events like SDL_UserEvent and SDL_QuitEvent are not used.
- Since touchscreens, cameras and pens are not target devices to support,
- their corresponding events are also not used.
+ The GUI is the most basic part of a UI to display information to users.

- Since mobile platforms are not targeted, events primarily designed for mobile platforms
- are also ignored. However, a mechanism to notify the user through an application warning
- notification that the system or the heap is running low on memory is possible.
- Still, it is expected that the application is always optimized to a degree that
- it can run and proceed properly without over-optimizing any important resources.
- When the application is low on memory, the user is responsible for managing memory size
- and reducing memory demands as much as possible through the customization configurations;
- otherwise, there is nothing the application could do to free up memory.
+ The rendering information of the GUI is transferred to displays for conversion
+ from electronic signals to electromagnetic energy using electrons.
+ Light information is then transmitted to users' eyes for the visual experience.

- The application window could optionally reduce rendering frequency when the window
- is not active. This includes the conditions where the window is hidden, minimized
- or occluded. The maximum number of frames per second (FPS) could be limited to 30 or
- half of the display refresh rate. The window cannot render properly
- and should stop rendering whenever the window bounds fall outside a single display area;
- spanning the window across multiple displays is currently unsupported behavior, for
- the sake of behavioral correctness and validity.
- This means functions such as SDL_GetDisplayForWindow are not supported.
+ Visual information is mostly based on screen coordinates and colors,
+ so the distinction between different GUI elements is made by the differences in colors
+ and the relative positions of contents.

    +
    +
    +
    +

- The application does not rely on the system theme but on its own textures and themes,
- so the system theme change event is not used; the application might use the system
- locale settings to identify which language to use when the game data directory is
- first initialized, but the value would not change afterward, so the
- system locale change event is also not used.
+ Alongside the GUI, the AUI enhances the User Experience (UX) by providing another
+ sensory channel to users.

- Display events may be used only when the application is in fullscreen, and only
- for the events related to the display the window is rendered on.
- Otherwise, there is no need to take care of display information.
- However, when the window is moved or resized, the display and window states must
- be checked to determine whether the window can still be rendered.
- It is important to note that the rendering controls are still handled in
- the Kotlin interface, so the events must be forwarded to the Kotlin side.
+ The acoustic information of the AUI is transferred to speakers for conversion
+ from electronic signals to the kinetic energy of particles through vibrations.
+ Sound information is then transmitted to users' ears for the aural experience.

- When the window gains focus, it is assumed that all input device states
- are "zero", meaning that all states are cleared or reset to origin.
- However, this may not always be true, so when the events resetting the states
- are fired but the states were already "zero", the states remain unchanged,
- but warnings (recovered situations) are emitted.
- It is important to note that the set of states in SDL is not used for this reason.
- Also, when the window loses focus, all the input device states in the application
- should be reset to origin and never updated until the window regains focus.
- When the cursor is outside the canvas's boundaries, the cursor is not regarded as
- focused on the canvas, and mouse events are not processed until the cursor
- enters the canvas's region again; the cursor is treated as nonexistent in the application
- and hidden in the canvas while it is outside the canvas's region.
+ Auditory information is mostly based on the frequencies and amplitudes of sound waves,
+ so different AUI contents are distinguished by the qualities of the sound waves
+ combined from different audio sources.

    +
    +
    +
    +

-            Event timestamps are ignored.
-            When they are polled from the SDL event queue, it is expected that they are sorted
-            in chronological order, so the timestamps should not be enough to affect
-            the synchronization states while they are pumped and polled per frame update.
+            Occasionally, HUI is used to enhance the immersion of the UX for users.

-            The modifier key states in SDL are not used generally.
-            For text typing, this should normally already be handled by the input editor,
-            which is processed in other events such as text editing or text input.
-            This includes key repeating, which is likely fired by the device or system.
-            The application itself internally holds a set of key states updated by
-            all the primary raw events, but not the keyboard states queried directly.
-            Also, the game is not playable with an external on-screen keyboard tool.
+            The haptic information from HUI is transferred from electronic signals into
+            the kinetic energy of the mechanical parts of tactile HIDs.
+            Users then sense the movement produced by that kinetic energy.

-            Scancodes of the keys would always be used internally, but the keycodes
-            may be used to display the hotkey settings in the application.
-            Therefore, the keycodes in the events are ignored, but for displaying
-            the key to users, a different way would be done.
-            Since scancodes do not change on keyboard layouts, the users have to
-            identify what layouts their keyboards are.
-            This cannot simply be done by SDL functions as some keys may be undistinguishable
-            (for keycodes) or not mapped against layouts between different keyboards (for scancodes).
-            A custom-built layout key mappings could be done to correctly map the keys
-            by user's choice, which is by default QWERTY.
-            It is still a question whether SDL_GetKeyFromScancode could be used
-            safely across different platforms and various devices.
+            HUI is most commonly driven by motors, for example in mobile phones and
+            game controllers. Its activation would mostly be associated with
+            specific scenarios from other events or user interactions.
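The layout-mapping scheme described above — compare stable scancodes internally, and consult a user-selected layout table (QWERTY by default) only when displaying hotkeys — can be sketched as follows. This is an illustrative sketch, not SDL's API: the scancode constants happen to match USB HID usage IDs, but the layout tables and names are invented for this example.

```java
import java.util.Map;

public class KeyLabeler {
    // Illustrative layout-independent scancodes (physical key positions).
    public static final int SCANCODE_Q = 20, SCANCODE_W = 26, SCANCODE_A = 4;

    // Per-layout display tables; only what is *shown* to the user changes.
    public static final Map<Integer, String> QWERTY =
            Map.of(SCANCODE_Q, "Q", SCANCODE_W, "W", SCANCODE_A, "A");
    public static final Map<Integer, String> AZERTY =
            Map.of(SCANCODE_Q, "A", SCANCODE_W, "Z", SCANCODE_A, "Q");

    // Internally the game stores and compares scancodes; a label is looked up
    // only when rendering hotkey settings, using the user's chosen layout.
    public static String label(Map<Integer, String> layout, int scancode) {
        return layout.getOrDefault(scancode, "SC#" + scancode);
    }
}
```

Keeping the layout choice purely presentational means hotkey bindings survive a layout switch unchanged.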

    +
    +
    +
    +

-            There would be two modes for mouse or cursor inputs.
-            By default, system cursor states are tracked and updated to cursor states,
-            which means the cursor motion states are updated each tick to correspond to
-            the cursor position retrieved from SDL.
-            When the game is in gameplay, the cursor mode turns to relative and wraps the cursor
-            in the window, most likely only at the center of the canvas.
-            In this mode, the mouse motion events from SDL are used instead of tracing
-            system cursor position.
+            Essentially, KUI carries feedback from users to the software or computer.

-            It is possible to select keyboard and mouse devices to use to exclusively only
-            accept the input events from just one device instead of all.
-            However, by default, all keyboard and mouse devices would be accepted to allow
-            further customization depending on this.
-            There is a possible concern about the application may become unusable when
-            the devices become unusable although other devices could still control the system,
-            so this feature might be reserved.
+            Users may initiate interactions with the computer via inputs to HIDs.
+            Analog information detected by the HIDs is then transmitted to the computer
+            as electronic signals for the software to analyze the interactions.

-            Joystick and gamepad update completion events (SDL_EVENT_JOYSTICK_UPDATE_COMPLETE
-            and SDL_EVENT_GAMEPAD_UPDATE_COMPLETE) are generally not used,
-            since the status updates are done through polling all the events in the certain
-            frame.
-            If any issue has been encountered, such as unideal non-synchronization scenarios
-            or delayed events, this should be reconsidered.
+            Most commonly, related HIDs include, but are not limited to, keyboards,
+            pointing devices, motion sensors, trackers and controllers.

    -
    - -

-            The GMS would only be implemented in the Kotlin interface with abstractions.
-            Under GMS, all Graphical Model Objects (GMOs) serve as graphical wrappers of
-            the underlying objects they manage, such as data, states, even other GMOs.
-            The hierarchy of GMOs includes Screens, Menus and Components,
-            where one includes the next directly without skipping any stratum.
-            Rendering of them would be backed by the Engine's rendering system,
-            using the low level Kotlin bindings.
-            However, audio feedbacks triggered by GMOs cannot be managed by GMOs but
-            an audio interface manager out of this GMS.
-

    -

-            The ScreenManager manages Screens stored in a linked list, with one
-            on the top covers the others underneath.
-            The Screen on the top also blocks user interactions via Graphical User Interface (GUI).
-            In the rendering system, the RenderSystem renders the Screens managed
-            by the ScreenManager onto the Canvas backed by the Engine.
-

    -

-            Rendering of content is not affected by system's display scale; the application itself
-            would include a GUI scale setting to scale the sizes of Menus and Components.
-            However, this only affects the dimensions but not all the margins.
-            In the implementation, margins would most likely use percentages or ratios.
-            Still, the details of implementation about GUI scaling are still subject to confirmation.
-

    -

-            It is complex to query the window safe area for rendering when it varies device by device.
-            Therefore, it should always assume that the safe area would generally be the canvas area
-            with paddings of 10 pixels, to make sure the rendering behavior is consistent enough
-            across platforms.
-            This only applied to important interactable or rendered content, general displayed content
-            like the background, miscellaneous margins, paddings and borders are not affected by this.
-
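The fixed safe-area convention above can be sketched directly; the `Rect` type and 10-pixel inset follow the text, while the class and method names are illustrative:

```java
public class SafeArea {
    public record Rect(int x, int y, int w, int h) {}

    // Assumed convention from the text: the safe area is the canvas area inset
    // by a fixed 10-pixel padding on every side, on all platforms, instead of
    // querying device-specific safe-area insets.
    public static Rect safeArea(Rect canvas) {
        final int pad = 10;
        return new Rect(canvas.x() + pad, canvas.y() + pad,
                Math.max(0, canvas.w() - 2 * pad),
                Math.max(0, canvas.h() - 2 * pad));
    }
}
```

Only important interactable content would be laid out within this rectangle; backgrounds and decorative borders may still fill the whole canvas.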

    -

-            Most graphical elements should be based on textures and the default shaders used with textures.
-            Only if certain geometric shapes are really required, like dynamic or variable lines and curves
-            used to show interactions and associate interactive elements on the GUI, the restriction
-            to use only textures could be bypassed.
-            However, it is still recommended to use customizable properties like colors via resources.
-            In this case, the default shaders which take textures would not be used, but the shaders
-            that take other attributes.
-

    -
    -
    -
    - -

-            The entire rendering framework is backed by OpenGL, so the details are mostly based on
-            the features and functionalities provided by OpenGL.
-            Basically, the framework aligns the rendering pipeline of OpenGL and general rendering
-            procedures, such as shaders and textures.
-

    -

-            On GUI, the camera is fixed, so the view is also fixed and constant.
-            Therefore, in shaders, the view matrix and transformation are omitted and disregarded.
-            The projection should always be orthogonal and thus being a uniform value, while
-            the model transformation and color filter would be done per instance as uniform values
-            evaluated each frame depending on the specifics, but not per vertex.
-
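Since the GUI projection is constant and orthographic, it can be computed once and uploaded as a single uniform; a minimal sketch in the classic glOrtho form (column-major, as OpenGL expects — the surrounding uniform-upload glue is omitted):

```java
public class OrthoProjection {
    // Column-major 4x4 orthographic projection matrix (classic glOrtho form).
    // With a fixed GUI camera, this is evaluated once per resize, not per frame.
    public static float[] ortho(float l, float r, float b, float t,
                                float n, float f) {
        float[] m = new float[16]; // all other entries stay zero
        m[0]  = 2f / (r - l);
        m[5]  = 2f / (t - b);
        m[10] = -2f / (f - n);
        m[12] = -(r + l) / (r - l);
        m[13] = -(t + b) / (t - b);
        m[14] = -(f + n) / (f - n);
        m[15] = 1f;
        return m;
    }
}
```

For example, `ortho(0, 800, 600, 0, -1, 1)` maps an 800x600 canvas with a top-left origin into clip space.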

    -

-            There would be several sets of texture atlases implemented in the Engine.
-            Texture atlases should be used to enhance performance since there would mostly be
-            a large number of texture units used in the game; otherwise frequent texture bindings
-            which are likely expensive operations may occur.
-            In general, one might use
-            half pixel correction
-            to solve the issue, but the outcome may not be ideal or fit perfectly with
-            the rendering results due to the clipped outermost pixels in a tile as
-            a texture atlas element (later referred as "texel").
-            Therefore, it is more recommended to have paddings or border pixels for each texel
-            in an atlas to keep the outermost pixels rendered well.
-            Even using "nearest neighbor" filters without paddings may not help solve the issue
-            completely because of potential numeric rounding errors.
-            All GUI textures would be managed with an atlas without any mipmapping, while all tile,
-            item and entity textures would be managed with different atlases with mipmapping.
-
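The recommended per-texel padding amounts to edge extrusion: repeating a texel's outermost pixels outward so that samples landing just outside its rectangle still read its own colors. A minimal sketch, assuming one packed `int` per pixel (the function name is illustrative):

```java
public class TexelPadding {
    // Extrudes (repeats) the outermost pixels of a texel `pad` times on each
    // side. Sampling slightly outside the texel's rectangle in the atlas then
    // reads the texel's own edge colors instead of bleeding from a neighbor.
    public static int[][] extrude(int[][] texel, int pad) {
        int h = texel.length, w = texel[0].length;
        int[][] out = new int[h + 2 * pad][w + 2 * pad];
        for (int y = 0; y < out.length; y++) {
            for (int x = 0; x < out[0].length; x++) {
                // Clamp the source coordinate to the texel's bounds.
                int sy = Math.min(h - 1, Math.max(0, y - pad));
                int sx = Math.min(w - 1, Math.max(0, x - pad));
                out[y][x] = texel[sy][sx];
            }
        }
        return out;
    }
}
```

Unlike half-pixel correction, this keeps the full texel renderable while still tolerating filtering and rounding at the borders.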

    -

-            Different textures would have different constrains resulting different configurations.
-            Since GUI textures are not supposed to be scaled down, but scaled up depending on
-            the scaling setting with the base layout set based on the lowest reference scaling,
-            mipmapping is not suitable for this.
-            On the other hand, the gameplay region may be scaled up or down, so mipmapping may be
-            applied to enhance the performance of rendering diminished textures.
-            Regardless of whether the atlas would be mipmapped, paddings would be done by repeating
-            the outermost pixels to the desired extent for each texel.
-            However, when the atlas is mipmapped, the padding size would be the number of pixels by
-            two to the power of the maximum mipmap level with the atlas, so paddings still take effect
-            for each mipmap.
-            When the atlas reached the hardware-limited size (known by GL_MAX_TEXTURE_SIZE),
-            a new atlas for the same configurations should be used, but stored together in a managed set.
-            Also, manual mipmapping for each texel is required to avoid edge bleeding during mipmapping;
-            computations of texture atlases would be performed and designed with reference to
-
-            the rectangle packing algorithm by David Colson
-            or crates like rectangle-pack and crunch-rs; unless
-            the texels are explicitly constrained to only powers of two, this kind of
-            advanced algorithm would still be used.
-

    -

-            In general, since the art style tends to be pixelated, mostly "nearest neighbor"
-            would be used to magnify the texels, but blending would be applied for diminishing to
-            simulate the similar colors.
-            At least, the mipmaps would be weighted using linear interpolation, but color filtering
-            could be either one of the available options through settings or launching options due
-            to rendering quality and computational complexity.
-            The maximum mipmapping level would be determined by the greatest power of two from
-            the common factors of all texel dimensions in the specific atlas, for which could be
-            further limited through settings or launching options.
-            Basically, there would still be the enforced constraint to the maximum value like four;
-            the dimensions of tile texels and that of entity texels must be a multiple of 8 and 2
-            respectively.
-
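The mip-level and padding rules above can be sketched concretely: the shared maximum mipmap level is the largest L such that 2^L divides every texel dimension in the atlas (clamped to the enforced cap, e.g. four), and each texel's per-side padding is then 2^L pixels. The helper names here are hypothetical:

```java
public class AtlasMips {
    // Largest level L such that 2^L divides every texel dimension,
    // clamped to an enforced cap (the text suggests four).
    public static int maxMipLevel(int[] dims, int cap) {
        int level = Integer.MAX_VALUE;
        for (int d : dims) {
            // numberOfTrailingZeros(d) = exponent of the largest power of two
            // dividing d, so the minimum over all dimensions limits the level.
            level = Math.min(level, Integer.numberOfTrailingZeros(d));
        }
        return Math.min(level, cap);
    }

    // Per-side padding of 2^L pixels keeps at least one padding pixel intact
    // at every mip level, since each halving shrinks the padding by half.
    public static int paddingFor(int maxMipLevel) {
        return 1 << maxMipLevel;
    }
}
```

This also explains the stated dimension constraints: tile texels being multiples of 8 guarantee at least three mip levels for any tile atlas.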

    -

-            Besides, the application would first enter the early loading stage to render the progress
-            by initializing minimal components required to show the progress bar and the loading screen.
-            So, the logo used in the loading screen would be embedded in the application, and other
-            components would be started loading and tracked after the initialization of the screen.
-            The loading screen may also show during loading of updated resource pack configurations.
-            Moreover, a short animated splashscreen could show before the loading screen.
-

    -

-            Any GMO would internally hold an opaque handle managed by the Engine.
-            When a Screen is sent to the active stack, the handle in the Engine would be sent to
-            the internally held state, RenderingState, to be used during rendering.
-            The state internally holds an empty Mutex lock which controls which thread
-            to allow access to only one thread for thread-safety.
-            Ideally, only the GMS ticking thread and the main thread, which performs rendering,
-            would access the state either for modifications or reading for rendering.
-            The state impacts the thread for modifications only when the GMOs are within the state,
-            where the state would internally mark it during transitions.
-            This pipeline could greatly reduce JNI calls during rendering to improve performance
-            with only one call per loop as the general ideas.
-            Note that synchronization must be carefully examined for any data accessed in the Engine.
-
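The RenderingState handoff described above can be sketched with a plain lock; the class name mirrors the text, while the opaque handle is simplified to a `long` and the JNI glue is omitted — only two threads (GMS ticking and the main/render thread) are expected to contend for it:

```java
import java.util.concurrent.locks.ReentrantLock;

// Illustrative sketch of a RenderingState holding an opaque Engine handle,
// guarded by a lock so only one thread accesses it at a time.
public class RenderingState {
    private final ReentrantLock lock = new ReentrantLock();
    private long engineHandle; // opaque handle owned by the native Engine

    // Called by the GMS ticking thread when a Screen enters the active stack.
    public void update(long newHandle) {
        lock.lock();
        try {
            engineHandle = newHandle;
        } finally {
            lock.unlock();
        }
    }

    // Called once per frame by the main thread; a single read per render loop
    // keeps JNI crossings to roughly one call per loop.
    public long readForRender() {
        lock.lock();
        try {
            return engineHandle;
        } finally {
            lock.unlock();
        }
    }
}
```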

    -

-            In general, rendering of any GMO should avoid reading framebuffer data drawn by
-            any previous GMO in the rendering loop.
-            This includes applying visual effects like color inverting or copying;
-            this is due to parallelization of shader computations in the render devices.
-            (See the related question.)
-            Various optimizations are also conducted, like reusing same shader programs as much
-            as possible when the previous draw call was also using the same program, eliminating
-            unnecessary OpenGL commands.
-

    -
    -
    -
    +

-            As a general idea, the minimal support version is OpenGL 2.0 with some extensions required.
-            Although at the time this is written, most consumer devices in the last decade should
-            support OpenGL 3.0 and above, 2.0 is still targeted as the minimal version for most other
-            potential users.
-            However, when a newer version or usable extension is detected with the OpenGL context,
-            advanced features that may improve performance or even other effects might be available.
+            Most of the time, different UIs work together to deliver the complete UX
+            to users.

-            According to Wikipedia, the GLSL version in OpenGL 2.0 is 1.10, so version 110 would be used
-            as the base GLSL version to use for all the shaders used in the Engine.
-            If a more advanced GLSL version is supported while advanced features are used, the target
-            GLSL version should be specified and the shaders are applied only when the target version
-            is matched for both GL and GLSL versions.
-            This is said, some shaders might be used or available only when they are available provided
-            by the OpenGL context.
+            Especially for GUI frameworks, KUI typically plays the critical role of feeding
+            user interactions back to the GUI for further processing. The GUI may also
+            sometimes communicate with the AUI to play audio indicated by user interactions.
+            In some cases, GUI and AUI may be associated, especially when playing videos
+            with audio channels.

    -
    - - -
  • EFP 3
  • -
  • EFP 5
  • -
-        Stack Exchange Game Development: "How to avoid texture bleeding in a texture atlas?"
-        answered by danijar
-
  • -
-        Stack Exchange Game Development: "How can I prevent seams from showing up on objects
-        using lower mipmap levels?" answered by Panda Pajama
-
  • -
    -
    -
diff --git a/efp/efp008/overview.svg b/efp/efp008/overview.svg
new file mode 100644
index 0000000..e3c8d51
--- /dev/null
+++ b/efp/efp008/overview.svg
@@ -0,0 +1,57 @@
[SVG markup lost in extraction; recoverable text labels: "MUI Framework Overview",
"Software", "Display", "Speakers", "Motors", "InputDevices" (HIDs), "GUI", "AUI",
"HUI", "KUI", "User", "LightEnergy", "SoundEnergy", "KeneticEnergy", "Activities"]

From 5479d7b3a699a7b474ef0c7bcdbcc2c5c37b80df Mon Sep 17 00:00:00 2001
From: Ben Forge <74168521+BenCheung0422@users.noreply.github.com>
Date: Wed, 25 Mar 2026 02:55:05 +0800
Subject: [PATCH 6/6] EFP 8: Add sections about UID & SRI

---
 efp/efp008/main.xml | 23 +++++++++++++++++++++++
 1 file changed, 23 insertions(+)

diff --git a/efp/efp008/main.xml b/efp/efp008/main.xml
index 926f84b..126269a 100644
--- a/efp/efp008/main.xml
+++ b/efp/efp008/main.xml
@@ -18,6 +18,19 @@

    +
    + +

+            User Interface Devices (UIDs) are indispensable in this Framework. UIDs are certain
+            classes of HIDs; one may refer to the HID Usage Tables for the exact classes to be
+            included in this Framework.
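For reference, the Generic Desktop usage page (0x01) of the USB HID Usage Tables assigns usage IDs to the most common input device classes; which of them count as UIDs remains this Framework's choice. The `classify` helper below is illustrative:

```java
import java.util.Map;

public class HidUsage {
    // Usage IDs from the Generic Desktop usage page (0x01) of the
    // USB HID Usage Tables.
    public static final Map<Integer, String> GENERIC_DESKTOP = Map.of(
            0x02, "Mouse",
            0x04, "Joystick",
            0x05, "Gamepad",
            0x06, "Keyboard",
            0x07, "Keypad");

    // Maps a Generic Desktop usage ID to a device class name, if known.
    public static String classify(int usageId) {
        return GENERIC_DESKTOP.getOrDefault(usageId, "Unclassified");
    }
}
```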

    +

+            In order to interact with UIDs via various UIs, Communication Interface standards are
+            necessary. The Communication Interface may generalize certain activities to a higher
+            level, where elements may act consistently across different devices of the same class.

    +
    +

    @@ -95,6 +108,16 @@

    +
    + +

+            Certain UIDs may not be interacted with via the above UIs; they are designed to output
+            user-understandable states without using those UIs. For such miscellaneous devices,
+            there may be no dedicated standardized protocols for communication, but likely generic
+            protocols for various types of devices.

    +
    +