Unity – Move a GameObject in the direction it's facing

I have the following four GameObjects, each with a rotation of -90, 0, 90, or 180 degrees. I want them to move as shown in the picture below.

[Image: the desired movement direction for each rotation]


This is what I have so far:

public class Projecttile : MonoBehaviour
{
    public float speed;

    private void Update()
    {
        transform.Translate(transform.up * speed * Time.deltaTime);
    }
}

The result of my code is the following: [Image: the resulting, incorrect movement]

Why does my code lead to this result, and how can I move each GameObject in the direction it is pointing?
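
For reference, the likely cause: Transform.Translate interprets its argument in local space (Space.Self) by default, so the world-space vector transform.up has the object's rotation applied to it a second time. A minimal sketch of the two usual fixes (either line alone is enough), assuming the same Projecttile class as above:

using UnityEngine;

public class Projecttile : MonoBehaviour
{
    public float speed;

    private void Update()
    {
        // Fix 1: pass a local-space direction; Translate (Space.Self by
        // default) applies the object's rotation to it exactly once.
        transform.Translate(Vector3.up * speed * Time.deltaTime);

        // Fix 2 (equivalent): keep the world-space direction, but tell
        // Translate not to rotate it again.
        // transform.Translate(transform.up * speed * Time.deltaTime, Space.World);
    }
}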

Unity – How can I stop a 3D character from jumping while in the air (double jump)?

I am working on a simple RPG and have problems with player jumping. I'm trying to use Unity's raycast to check for a collider under the player before allowing another jump. The problem is that no matter what changes I make to where canJump is cleared or checked, the player can still jump a second time at the peak of a jump. At lower values, the player can even hover if the timing is right. Could someone look over my code? It seems I have a noob error somewhere.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class player : MonoBehaviour
{
    public float speed;
    public float jumpForce;
    public float turningSpeed;

    private bool canJump;

    void Update()
    {
        // Re-arm the jump whenever the ray below the player hits something.
        RaycastHit hit;
        if (Physics.Raycast(transform.position, Vector3.down, out hit, 1.01f))
        {
            canJump = true;
        }

        ProcessInput();
    }

    void ProcessInput()
    {
        if (Input.GetKey("right") || Input.GetKey("d"))
        {
            transform.position += Vector3.right * speed * Time.deltaTime;
        }

        if (Input.GetKey("left") || Input.GetKey("a"))
        {
            transform.position += Vector3.left * speed * Time.deltaTime;
        }

        if (Input.GetKey("up") || Input.GetKey("w"))
        {
            transform.position += Vector3.forward * speed * Time.deltaTime;
        }

        if (Input.GetKey("down") || Input.GetKey("s"))
        {
            transform.position += Vector3.back * speed * Time.deltaTime;
        }

        if (Input.GetKeyDown("space") && canJump)
        {
            canJump = false;
            GetComponent<Rigidbody>().AddForce(0, jumpForce, 0);
        }
    }
}
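
For what it's worth, a likely culprit: the 1.01-unit ray is cast from transform.position (the player's center), so it still reaches the ground near the apex of a low jump, and it re-arms canJump on the very next frame after AddForce, before the physics step has lifted the player. A minimal sketch of a tighter grounded check; the 0.05f skin width, the velocity threshold, and the class name are illustrative assumptions, not a canonical fix:

using UnityEngine;

public class GroundedJump : MonoBehaviour
{
    public float jumpForce;

    private Rigidbody rb;
    private bool canJump;

    void Start()
    {
        rb = GetComponent<Rigidbody>();
    }

    void Update()
    {
        // Cast only slightly past the collider's bottom instead of a fixed
        // 1.01 units from the center, so the ray stops hitting the ground
        // as soon as the player actually leaves it.
        float halfHeight = GetComponent<Collider>().bounds.extents.y;
        bool grounded = Physics.Raycast(transform.position, Vector3.down,
                                        halfHeight + 0.05f);

        // Only re-arm the jump while grounded and not moving upward, so the
        // frames right after AddForce (before physics has moved the player)
        // cannot re-enable it.
        canJump = grounded && rb.velocity.y <= 0.01f;

        if (Input.GetKeyDown("space") && canJump)
        {
            canJump = false;
            rb.AddForce(0, jumpForce, 0);
        }
    }
}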

C# – Some auto-completions don't work in Visual Studio Code with Unity

I'm using Unity 2018 with Visual Studio Code version 1.43.2 on a Mac.


Everything seemed to work fine between Unity and Visual Studio Code, but now I've realized that something is wrong. I was following along with a tutorial, and although VS Code's autocompletion seemed to be working, OnTriggerEnter, which autocompleted for the person in the video (who used full Visual Studio), did not autocomplete for me. Many other Unity things do show up as options, however.

I've looked at Microsoft's VS Code and Unity help page and have everything I think is necessary to connect them: .NET, Mono, Debugger for Unity, the C# extension, and Unity Tools for good measure.

Generic C# things like return, void, int, etc. autocomplete, so the C# extension appears to be in order. Many Unity things, like Vector3, GameObject, Destroy, transform, etc., also autocomplete as they should.

What is wrong here? How can I fix this and get full autocomplete?

Unity Profiler – How do I fix audio (WASAPI) feeder issues?

tl;dr: The audio (WASAPI) feeder isn't doing anything; it just sits idle. Something else is using the time, and (spoiler alert) it's the GPU.

Here are the steps:

  1. Make sure that GPU profiling is enabled.
  2. If the error "GPU profiling is not supported by the graphics card driver" appears during GPU profiling, enable "Graphics Jobs (Experimental)" in the Player settings (details below).
  3. Run the profiler again with GPU profiling enabled.
  4. Note that rendering takes up all the time in the GPU Profiler.

I am developing on a laptop with a mobile NVIDIA card, which may be the reason for the GPU profiling error, and I suspect that error is why GPU time was not shown in the other profiler views.

I noticed that "Other" accounted for all the GPU time, but then realized that "Other" was the editor, because I had been using the Scene view to move around the scene. When I clicked into the Game view and moved the player camera through the scene, the GPU rendering time shifted from "Other" to "Opaque".

Finally, here is the Graphics Jobs (Experimental) setting:

[Image: location of the Graphics Jobs (Experimental) setting in the Player settings]

Unity – Make the enemy follow the player intelligently

I tried to create a script that lets my enemy follow my player intelligently (but not too much), and I want to know if there is a better way (of course there is one, but I wanted to find a way myself).
I did the following:

I wanted my enemy to notice the player once the player comes within a certain distance. I have a variable that determines when the enemy has to move away (player too close) and when it should follow (player too far). I also didn't want the enemy to forget that there is an intruder just because the player goes around a corner out of sight.

If the enemy can see the player directly (raycast), it turns toward him and follows him directly by normalizing the difference between their two positions. Now comes the hard part, the one I'm not sure about. My player keeps a list of his last 1000 positions, updated every 0.01 seconds by adding the newest position and dropping the oldest. If the enemy cannot see the player directly (!raycast), it goes through every position in that list, checking for each Vector3 (see the sketch after this list):

1. Is there anything between the enemy and this position?

2. Is there anything between this position and the player?

It then adds the distance between the enemy and the point to the distance between the point and the player, and checks:

3. Is this combined path shorter than the shortest path found so far? If it is, the point becomes the new best point.

If it finds a point that meets these conditions, the enemy moves toward it, and if it still doesn't see the player, it keeps looking for new points.
If no point meets the conditions, the enemy simply stops following the player (it has lost him).
All of this uses raycasting: to check the candidate positions and to detect objects between the enemy, the player, and the points along the way.
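
To make the search concrete, here is a hedged sketch of the breadcrumb check described above; the names (BreadcrumbSearch, FindBestBreadcrumb, recordedPositions, obstacleMask) are illustrative, not the actual script:

using System.Collections.Generic;
using UnityEngine;

public class BreadcrumbSearch : MonoBehaviour
{
    // Layers that can block line of sight (walls, props, ...).
    public LayerMask obstacleMask;

    // Returns the recorded position with the shortest enemy->point->player
    // path that is visible from both ends, or null if none qualifies
    // (meaning the enemy has lost the player).
    public Vector3? FindBestBreadcrumb(Vector3 enemyPos, Vector3 playerPos,
                                       List<Vector3> recordedPositions)
    {
        Vector3? best = null;
        float bestLength = float.PositiveInfinity;

        foreach (Vector3 point in recordedPositions)
        {
            // 1. Is there anything between the enemy and this position?
            if (Physics.Linecast(enemyPos, point, obstacleMask)) continue;

            // 2. Is there anything between this position and the player?
            if (Physics.Linecast(point, playerPos, obstacleMask)) continue;

            // 3. Is enemy->point->player shorter than the best path so far?
            float length = Vector3.Distance(enemyPos, point)
                         + Vector3.Distance(point, playerPos);
            if (length < bestLength)
            {
                bestLength = length;
                best = point;
            }
        }

        return best;
    }
}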

Unity – Reuse assets in different rendering pipelines

What's the best way to work on the same project with two rendering pipelines that share 99% of the assets?

We have a really big project in which we decided to use both the URP and HDRP pipelines in Unity. Because of the way our application is set up, the only differences between the two pipelines from an asset-folder perspective are 2-3 materials and another 2-3 shaders.

Our scripts are not an issue: most of the code lives in DLLs and is automatically output to the correct location.

The best solution we found was to create a directory junction between two folders in our assets, so the structure now looks like this:

HDRP
└───Assets
    └───Custom
    |    |
    |    └─ Materials
    |          Material1.mat
    |          Material1.mat.meta
    |           ...
    └───Shared
         |
         └─ Materials
               Material1.mat
               Material1.mat.meta
               ...
URP
└───Assets
    └───Custom
    |    |
    |    └─ Materials
    |          Material1.mat
    |          Material1.mat.meta
    |           ...
    └───Shared
         |
         └─ Materials
               Material1.mat
               Material1.mat.meta
               ...

The junction was made for the Shared folders, which hold literally 99% of our assets. The problem with this approach is that we have difficulty overriding the shared assets with the custom ones (see my other question). I would like to know if our approach is wrong, and if so, why, and what would be better.

Unity 3D: Missing "Edit Angular Limits" button on the Character Joint

So I followed one of Brackeys' tutorials on ragdoll physics, and part of the tutorial shows him clicking a button at the top of the Character Joint component labeled "Edit Angular Limits". But my Character Joint has no such button. Has it been removed? Or where do I find it?

Unity 2D – Collision problem with tile edges

I use Unity (in 2D).

I apply force to the player's Rigidbody2D when moving left/right. For some reason, when the player hits the edge between two tiles, they suddenly get pushed up and forward.

The player has a circle collider for their body to prevent them from catching on the tiles. The default physics material is a 0-bounciness / 0-friction material so that they don't stick to walls.

The player isn't jumping: the jump is bound to a key, and the debug output shows that no jump is being triggered.

Does anyone know what happens / what causes this sudden use of force? Any tips on how to fix the problem?

(This is a crosspost to reddit)

[Image: the problem visualized]

Unity: On iOS, materials with normal maps look completely wrong

We are creating an AR-enabled app with Vuforia and Unity 2019.3 using the Universal Render Pipeline.
The problem is that on iOS, our models whose materials contain normal maps look completely broken, as you can see: [Image: the broken rendering]

When we remove the normal maps, the models look fine, but since the normal maps add a lot of detail and depth to the scene, we'd rather not drop them.
We tried changing the normal map import settings, compression formats, etc., but it made no difference.

Is there any other solution to fix this?

By the way, there is no problem on Android.