Reducing unsatisfactory navigation states without losing consistency

Welcome to UX StackExchange!

One of my favorite “decision helpers” in design is the Principle of Least Surprise: try to avoid system behavior that your users don’t expect. (Yes, sometimes this is pretty straightforward, and at other times it’ll take lots of usability testing to find out what, exactly, users expect. 😉 )

You already noted that options 2 and 3 violate that principle.

In contrast, option 1 is a very common pattern that can be generalized like so: allow users to navigate to an empty container (a folder, a page, a tab view, etc.), but clearly indicate that it is empty.

To help them decide whether a container is worth opening, you can display the number of contained items in the context of the navigation control that takes users to that container.

Here’s a simple example from Apple Mail: The selected Inbox folder is empty, and the main content area displays an explicit notice. Other folders display the number of unread messages they contain.

[Screenshot: Apple Mail’s folder list with the empty Inbox selected; other folders show unread-message counts]

Applied to your design problem, you could implement option 1 and add the number of available analytics graphs to the sub-navigation tab labels. Something like this:

[Mockup: sub-navigation tabs with the number of available graphs in each tab label]

Mike M’s approach is another good option, but I just find it that little bit more restrictive.

navigation – Reducing the ‘Gulf of Execution’ without losing consistency

I’m working on a fairly large analytics dashboard. One challenge that keeps recurring is how to always display relevant information without losing consistency.

We have a set of items that can be analyzed. Selecting and changing those is always possible; it’s the top-most navigational element (think: drop-down in the top navigation bar).
We have a set of analytics. While many of those analyses are shared between different layers, some are not.

Here’s the problem:
Say we have two analyzable items A and B.
For A we provide analytics X and Y.
For B we provide analytics X, Y and Z.

What happens when a user is currently on page Z for layer B and changes the selected layer to A? We came up with three options:

  1. We stay on page Z and say “No analytics Z for layer A”. The navigation stays consistent throughout the dashboard, but there are some “invalid” states, which may annoy the user (Norman calls this a “Gulf of Execution”).

  2. We jump to another analytics page that’s available for A (but which one?). Inconsistent navigation, with unexpected jumps, but all states are always valid.

  3. We stay on page Z, say “No analytics Z for layer A”, and allow switching pages, but as soon as the user leaves page Z we disable/hide it.
    Navigation stays somewhat consistent, no jumps, no invalid states. But it feels strange that the user cannot go back to where he/she came from.

Any other/better ideas?

C# UDP Multiplayer Networking – issue with choppiness and consistency

I know I’ve done this incorrectly, but I don’t know what the right way is.

Right now I have two clients that connect to a server. Each tick the players send a packet of information containing their X,Y to the server. The server takes those packets and sends them to the other clients. So it’s like this:

EXAMPLE:
Player1: 1:x,y or 1:100,125
Player2 receives that information, and then moves that instance of a NetPlayer based on the ID (1)

When the NetPlayer receives a new position packet from the server, it calls MovePlayerTo(Vector2 _newPosition);

public void MovePlayerTo(Vector2 _newPosition) {
    this.newPosition = _newPosition;
    //OldPacketPosition = newPosition;
    //newPosition = _newPosition
    //LerpFromOldToNew(OldPacketPosition, newPosition);
}

Then, in the NetPlayer.Draw method (called 60 times a second), the Position is set like so:

 Position.X = MathHelper.Lerp(newPosition.X, Position.X, lerpValue);
 Position.Y = MathHelper.Lerp(newPosition.Y, Position.Y, lerpValue);

This has smoothed the player out pretty well, but it still doesn’t look as smooth as a local player, even with interpolation. It seems like it’s being drawn at 45 FPS. If I lower my client tick delay to 200 ms, it lerps very quickly between those positions in short bursts; it doesn’t actually make the entire movement smooth.

I’ve tried changing MovePlayerTo() to set an OldPacketPosition and a NewPacketPosition, and then lerping between the last packet’s position and the new one, and it becomes even choppier, no matter what I set my (float) lerpValue to.

Also I noticed if I jump up and down a bunch, it doesn’t show the NetPlayer hitting the ground. He moves smoothly and kinda bounces before he even hits the ground and shoots back up in the jump animation.

I know I’m doing this improperly. I’ve seen other people set their tick rate to be really low, and the positions aren’t super accurate but the movement is still smooth between packets.
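
For reference, here is a minimal sketch of the usual fix, snapshot interpolation: keep the two most recent received positions and play back between them over exactly one send interval, so the remote player is drawn slightly in the past but always moves at the sender’s real speed. (Note that Lerp(newPosition, Position, lerpValue) in the Draw code eases toward the latest packet by a fixed fraction per frame, which is frame-rate dependent and slows down as it approaches the target.) The class shape, SendInterval, and the Update hook below are assumptions, not the original code:

// A minimal sketch of snapshot interpolation, assuming XNA/MonoGame-style
// types. NetPlayer's shape, SendInterval, and the Update hook are assumed.
using Microsoft.Xna.Framework;

public class NetPlayer
{
    // Seconds between position packets; must match the sender's tick rate.
    const float SendInterval = 0.1f;

    Vector2 previousPacketPosition;  // snapshot before last
    Vector2 latestPacketPosition;    // most recent snapshot
    float timeSincePacket;           // seconds since the last packet arrived

    public Vector2 Position { get; private set; }

    // Called whenever a position packet arrives from the server.
    public void MovePlayerTo(Vector2 newPosition)
    {
        previousPacketPosition = latestPacketPosition;
        latestPacketPosition = newPosition;
        timeSincePacket = 0f;
    }

    // Called once per frame with the frame's elapsed time.
    public void Update(float deltaSeconds)
    {
        timeSincePacket += deltaSeconds;

        // Play back from the older snapshot toward the newer one over exactly
        // one send interval, so on-screen speed matches the sender's speed.
        float t = MathHelper.Clamp(timeSincePacket / SendInterval, 0f, 1f);
        Position = Vector2.Lerp(previousPacketPosition, latestPacketPosition, t);
    }
}

With a 100 ms send interval this draws the remote player about one packet behind, which is the standard trade-off: a little added latency in exchange for smooth, constant-velocity motion between snapshots. It is also why others can run a low tick rate and still look smooth.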

postgresql – How can data consistency be guaranteed across multiple reads?

Let us imagine that my application has to perform a series of SELECTs (on different tables) in succession to collect different pieces of information from the database, and we do not want any of these tables to change while we collect the data.
With PostgreSQL, the REPEATABLE READ or SERIALIZABLE isolation levels can be used. However, if another transaction commits changes to a table that we have not yet referenced, those changes will still be visible even though our transaction has already started, as the following sequence of actions shows:

T1: BEGIN ISOLATION LEVEL SERIALIZABLE;    # imagine here table t has 10 rows
T2: INSERT INTO t VALUES(1, 2, 3);
T1: SELECT COUNT(*) FROM t;                # we'll see 11 rows for the rest of the transaction

However, if T1 accesses t before T2 does the insertion, 10 rows are visible for the duration of the entire transaction:

T1: BEGIN ISOLATION LEVEL SERIALIZABLE;    # imagine here table t has 10 rows
T1: SELECT COUNT(*) FROM t;                # we'll see 10 rows for the rest of the transaction
T2: INSERT INTO t VALUES(1, 2, 3);
T1: SELECT COUNT(*) FROM t;                # still sees 10 rows, etc.

With the above behavior, if we need to access multiple tables during the transaction, many of them can change in the interval between the start of the transaction and the moment we first access them.
I understand that isolation levels should work this way, so no explanation is needed here.

But is there a way to create a kind of "snapshot" after a certain point in time? Are explicit locks required in this case?
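
A PostgreSQL-specific detail that may help here (this is documented behavior; the listing below is just a sketch): at REPEATABLE READ and above, the transaction's snapshot is taken by the first statement, not by BEGIN. Issuing any cheap query immediately after BEGIN therefore pins a snapshot that all later reads share, without explicit locks; pg_export_snapshot() additionally lets other sessions adopt the very same snapshot.

BEGIN ISOLATION LEVEL REPEATABLE READ;
SELECT 1;                          -- first statement: the snapshot is taken here
-- All later reads see the database as of this point,
-- including tables we have not touched yet.
SELECT COUNT(*) FROM t;
SELECT COUNT(*) FROM other_table;  -- other_table is a hypothetical name
COMMIT;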

Set theory – how do you understand the interface between the consistency strength hierarchy, reverse mathematics, and proof-theoretic ordinal analysis?

I am aware of three important "hierarchies" of mathematical theories, but I do not know how to relate these hierarchies. Here are the hierarchies I'm thinking of:

  1. Consistency strength. My understanding is that here one looks at (recursively axiomatizable?) theories $T$ of arithmetic (or which interpret the language of first-order arithmetic) of sufficient strength to set up a coding scheme for the syntax of first-order languages. One orders these theories (partially) by saying that $T > T'$ if $T$ proves the consistency of $T'$ (when $T'$ is encoded according to the syntactic scheme mentioned above).

  2. Reverse math. My understanding is that here one looks at theories $T$ of second-order arithmetic (or those interpreting the language of second-order arithmetic) and (partially) orders them directly by their implications.

  3. Proof-theoretic ordinal analysis. My understanding is that here one looks at theories $T$ of arithmetic (or which interpret the language of first-order arithmetic) and orders them by their proof-theoretic ordinal: the supremum of all (countable) ordinals $\alpha$ such that there is a relation $R \subseteq \mathbb{N} \times \mathbb{N}$, definable in $T$, such that $T$ proves that $R$ is a well-order (although the sense in which $T$ can even express that $R$ is a well-order, given that $T$ is first-order, is something I don't quite understand) and $R$ is (externally) isomorphic to $\alpha$. (A compact restatement follows below.)
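
For reference, the compact restatement, as I read the three orders (taking $\mathsf{RCA}_0$ as the reverse-math base theory is my assumption, and I write $\mathrm{WO}(R)$ for "$R$ is a well-order"):

$$T \geq_{\mathrm{Con}} T' \iff T \vdash \mathrm{Con}(T'),$$
$$T \geq_{\mathrm{RM}} T' \iff \mathsf{RCA}_0 + T \vdash \sigma \text{ for every axiom } \sigma \text{ of } T',$$
$$|T| = \sup\{\alpha < \omega_1 : \exists R \subseteq \mathbb{N} \times \mathbb{N} \text{ definable in } T,\ T \vdash \mathrm{WO}(R),\ R \cong \alpha\}.$$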

Questions:

  1. Where do the areas of application of these hierarchies overlap?

For example, the "strongest" theories (such as ZFC + large cardinals) usually seem to be compared by consistency strength, as opposed to reverse math or ordinal analysis. I have the impression that proof-theoretic ordinals are most often computed for relatively weak theories, and that reverse math sits somewhere in the middle. But I'm not even sure where to look for overlaps between these areas, partly because the kinds of theories considered in each hierarchy are slightly different.

  2. How are these hierarchies related where their domains overlap?

In general, I imagine there are no outright implications saying that one of these partial orders refines one of the others (even where their domains of application coincide). But I imagine there are some general tendencies – a theory that is stronger in one hierarchy should probably be stronger in the others as well.

  3. Should I really think of these three hierarchies as "comparable", in the sense that they each convey an idea of the "strength" of a theory? And are there other hierarchies I should also consider in this regard?

White Balance – Color consistency between RAW images for time lapse in darktable

Why do I get different colors if I apply exactly the same edits to all images in Darktable? I understood that the camera's white balance setting doesn't matter for a RAW file, and that I would achieve color consistency if I applied the same edits, with the same white balance, to two different images of the same scene.

Then why do I get this instead? And how can I achieve color consistency between these images?

[Image: two frames from the time lapse showing visibly different colors]

The files are in .ARW format and were recorded with a Sony a6000.

Do replicated, distributed multi-primary systems ensure sequential consistency?

I know that replicated, distributed primary-backup systems ensure sequential consistency. My question is whether multi-primary systems can achieve this as well. If you use consensus (e.g., the Paxos algorithm) to agree on an order for the requests received, sequential consistency is likely to be achieved. But what if you use conflict-free replicated data types (CRDTs) instead – is sequential consistency still achieved?
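
To make the contrast concrete, here is a minimal sketch of a grow-only counter, one of the simplest CRDTs (my own illustration, not taken from any particular system). Merges commute, associate, and are idempotent, so all replicas converge to the same value regardless of delivery order, but no total order over the increments is ever agreed on – which is why CRDTs give (strong) eventual consistency rather than sequential consistency.

// G-Counter CRDT sketch (illustrative names, not from any real system).
// Each replica increments only its own slot; merge takes element-wise max.
using System;
using System.Linq;

public class GCounter
{
    readonly int[] counts;   // one slot per replica
    readonly int replicaId;  // index of this replica's slot

    public GCounter(int replicaCount, int replicaId)
    {
        counts = new int[replicaCount];
        this.replicaId = replicaId;
    }

    public void Increment() => counts[replicaId]++;

    public int Value => counts.Sum();

    // Commutative, associative, idempotent: replicas converge no matter
    // how merge messages are ordered, delayed, or duplicated.
    public void Merge(GCounter other)
    {
        for (int i = 0; i < counts.Length; i++)
            counts[i] = Math.Max(counts[i], other.counts[i]);
    }
}

A consensus-ordered log (Paxos, Raft) does the opposite: it pays coordination cost to give every operation a fixed position, and that total order is what yields sequential consistency.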

dnd 5e – What is a shadow (monster) made of, in terms of consistency?

It is amorphous

Dictionary definition:
a · mor · phous (adjective); əmôrfəs
without a clearly defined shape or form

That means its substance and consistency can change. This is similar to the unique nature of the Mimic, but it is not the same: the mimic is also amorphous, but it is neither undead nor incorporeal. For example, if the shadow were, or could become, incorporeal, it would be able to move through things the way a ghost can.

Incorporeal Movement. The ghost can move through other creatures and objects as if they were difficult terrain. It takes 5 (1d10) force damage if it ends its turn inside an object. {A specter and a wraith also have this trait.}

Let's look at some important features

SHADOW, medium undead, chaotic evil
Skills Stealth +4 (+6 in dim light or darkness)
Damage Vulnerabilities radiant
Damage Resistances acid, cold, fire, lightning, thunder; bludgeoning, piercing, and slashing from nonmagical attacks
Damage Immunities necrotic, poison
Condition Immunities exhaustion, frightened, grappled, paralyzed, petrified, poisoned, prone, restrained

Note that mundane weapons can damage it, so it has some substance to it.
What can the shadow do if it is amorphous?

Amorphous. The shadow can move through a space as narrow as 1 inch wide without squeezing.

Vampires in mist form and water elementals can do something similar, and yet each has a different consistency.

  • I like to describe the consistency of a shadow as "like thick smoke",
    so that you get an idea of how it would feel, but also that it has to
    flow around things (like walls) instead of flowing through them
    (the way a ghost can).
  • You could also call its consistency "dense fog", which comes close to
    what it seems to be.

    The MM leaves the exact description to the DM.

Shadow Stealth. While in dim light or darkness, the shadow can take the Hide action as a bonus action.
Sunlight Weakness. While in sunlight, the shadow has disadvantage on attack rolls, ability checks, and saving throws.

The last, and somewhat unsatisfying, answer is that a shadow is almost incorporeal; but since that term is not used to describe it, as it is for other monsters, it is "almost insubstantial": you can't get a grip on it, knock it down, or otherwise hold it.

Consistency Hierarchies and Gödel's Theorem

Consider the infinite sequence of theories:

  1. $T_0 = T$ = the base theory
  2. $T_{n+1} = T_n +$ "$T_n$ is consistent"

Define $T_\omega = \bigcup^{\infty}_{i=0} T_i$.

  1. Is $T_\omega$ guaranteed to be consistent, provided that $T$ is consistent and can interpret arithmetic?
  2. If the answer to 1 is "no", what about the theory $T_L = \bigcup_i T_i$, where $i$ ranges over all ordinals?
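
For question 2, the natural reading of the transfinite iteration (my assumption about what $T_L$ is meant to be) is the Turing–Feferman-style progression

$$T_0 = T, \qquad T_{\alpha+1} = T_\alpha + \mathrm{Con}(T_\alpha), \qquad T_\lambda = \bigcup_{\alpha < \lambda} T_\alpha \ \text{for limit } \lambda.$$

Note that writing down $\mathrm{Con}(T_\alpha)$ as an arithmetic sentence requires a notation for $\alpha$, so the progression is only well-defined along a system of ordinal notations; beyond the computable ordinals there is no canonical way to continue it.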