phishing – Should big companies publicly list their legit emails and links?

I’ve been going back and forth with a big mobile company about whether I should click a link in an email that I think is from them. Neither the email address nor the link inside it is obviously from the company: the domains don’t match the provider’s domain or any of its known alternate domains. Discussion on the company’s forum hasn’t established whether the email is legitimate either, so the whole situation has been confusing.

Should large companies have a security standards page listing every legitimate email address and domain that belongs to or is associated with them? For someone like me who checks the sender address and the domains of the links inside, I want to know who they are. If the company listed them on their website, I would feel instantly safe. Instead, I’ve wasted hours trying to resolve this situation, still without an answer.

My question – does such a scheme for listing legitimate emails/domains on company websites exist? If not, why not? If so, where?
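As an aside, the closest widely deployed scheme is DNS-based email authentication: companies publish SPF, DKIM, and DMARC records that declare, in machine-readable form, which servers may send mail for their domains. A minimal sketch of checking those records from a Unix shell with dig, using example.com as a placeholder for the provider’s domain:

# Show the domain's SPF record, which lists the servers authorized to send mail for it
dig +short TXT example.com | grep "v=spf1"

# Show the domain's DMARC policy, which tells receivers what to do with mail that fails SPF/DKIM
dig +short TXT _dmarc.example.com

These records answer “who may send mail as this domain”, though they don’t provide the human-readable list of link domains the question asks about.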

web application – Should Maintenance Information Such As “Service Will Be Unavailable At X And Should Be Back At Y” Be Publicly Available?

Suppose you are implementing a maintenance info banner that is publicly shown on your web application, meaning that even unauthenticated/unauthorised users can access this information easily.

Let’s assume that this banner won’t contain any information about technical implementation details or third-party software versions whatsoever: just a plain statement that maintenance will happen at a specific point in time, that the service will be unavailable while it does, and perhaps when it is expected to be back up.

What are the security implications of this? Is this useful information for adversaries who want to time attacks around “sensitive” moments such as the shutdown/startup of the web application? Are there any other considerations that should be made?

rest – API security for publicly exposed APIs used by a website/app

We have a website (a PWA client) and a mobile app, both using the same APIs.
These APIs are exposed publicly. Currently they are not secured, meaning anyone can inspect an API’s signature via developer tools or a proxy tool and hit the API directly.

We want our APIs to be hit only by verified clients.
Verified clients do not mean logged-in clients; our website can be used by non-logged-in users as well. Clients here means users accessing the website via browsers or the app.

So for that, we are planning to allow only those API calls that carry a registered/enabled token, sent via a header.

Now to generate the token:

  1. Device —- sends Token(B) request —-> Server
  2. Server generates Token(B), stores it in Redis, and returns it
  3. Device —- sends enable request for Token(B) —-> Server
  4. The server enables it
  5. The device sends Token(B) in all subsequent requests
  6. The server checks whether the token exists in Redis in the enabled state

Since these register/enable token APIs are also exposed publicly, we add the following to make the process harder to abuse:

  • While enabling the token, we also send an encrypted copy of the token (A) along with the actual token (B).
  • At the server, we decrypt token (A) and match it against the plain token (B).

Encryption is done using a shared secret key known only to the client and server.
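A minimal sketch of the intended flow with curl, assuming hypothetical endpoint and header names (/token/request, /token/enable, X-Token, X-Token-Enc) and a hypothetical encrypt_with_shared_key helper, all standing in for whatever the real API uses:

# Steps 1-2: request a token; the server generates Token(B), stores it in Redis, and returns it
TOKEN_B=$(curl -s -X POST https://api.example.com/token/request)

# Client side (assumed): encrypt Token(B) with the shared secret baked into the client to get Token(A)
TOKEN_A=$(encrypt_with_shared_key "$TOKEN_B")   # hypothetical helper

# Steps 3-4: enable the token, sending the encrypted copy (A) alongside the plain token (B)
curl -s -X POST https://api.example.com/token/enable -H "X-Token: $TOKEN_B" -H "X-Token-Enc: $TOKEN_A"

# Steps 5-6: all subsequent requests carry Token(B); the server checks Redis for the enabled state
curl -s https://api.example.com/some/endpoint -H "X-Token: $TOKEN_B"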

Is this the right approach, or is it vulnerable? The only issue I see is that the register/enable token APIs are exposed publicly. But we have also added the security described above; is that good enough?

What is the ideal public configuration file nxt.properties for a publicly accessible node for NXT / Ardor?

Below is what I use for a public Ardor node. Some of the lines are commented out; you can activate them if you wish.

# Can also specify networks in CIDR notation, e.g. 192.168.1.0/24.
#nxt.allowedBotHosts=127.0.0.1; localhost; [0:0:0:0:0:0:0:1];
nxt.allowedBotHosts=*; localhost; [0:0:0:0:0:0:0:1];


# Host interface on which to listen for http/json API request, default localhost only.
# Set to 0.0.0.0 to allow the API server to accept requests from all network interfaces.
#nxt.apiServerHost=127.0.0.1
nxt.apiServerHost=0.0.0.0


# Hosts from which to allow NRS user interface requests, if enabled. Set to * to allow all.
#nxt.allowedUserHosts=127.0.0.1; localhost; [0:0:0:0:0:0:0:1];
nxt.allowedUserHosts=*; localhost; [0:0:0:0:0:0:0:1];


# Host interface for NRS user interface server, default localhost only.
# Set to 0.0.0.0 to allow the UI to be accessed on all network interfaces.
#nxt.uiServerHost=127.0.0.1
nxt.uiServerHost=0.0.0.0


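# Run this node on mainnet rather than testnet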
nxt.isTestnet=false


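# Allow cross-origin (CORS) requests so browser-based clients on other origins can call the API and UI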
nxt.apiServerCORS=true
nxt.uiServerCORS=true

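# Uncomment to set the API port explicitly (27876 is the Ardor default)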
#nxt.apiServerPort=27876


#nxt.apiServerSSLPort=27876
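
# Keep prunable data forever (-1 disables pruning) and serve expired prunable data,
# so this public node can act as an archival node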
nxt.maxPrunableLifetime=-1
nxt.includeExpiredPrunable=true
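# Password for the protected admin API functions; required once the API is open to other hosts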
nxt.adminPassword=SET_YOUR_OWN

# The default account is used to automatically log in to the wallet during startup
nxt.defaultDesktopAccount=ASDFASDF

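# Uncomment to serve the UI and API over SSL (a keystore must also be configured)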
#nxt.uiSSL=true
#nxt.apiSSL=true

[Politics] Unanswered question: Trump clashes publicly with the top doctor over coronavirus drugs: if he is re-elected, will he get us to put Gatorade on the plants?

where on earth – What is the largest publicly accessible air-conditioned room in the world?

I will start with one:

The Flower Dome in Singapore’s Gardens by the Bay is the largest greenhouse in the world, as listed in the 2015 Guinness Book of Records. The room is kept at 23–25 °C all year round. For comparison: ambient temperatures in Singapore fluctuate around 30 °C.

command line – Use wget to download Google Drive files without making the file’s link publicly available

Make the file public first, and then for small files I use:

wget --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILEID' -O FILENAME

For large files:

wget --load-cookies /tmp/cookies.txt "https://docs.google.com/uc?export=download&confirm=$(wget --quiet --save-cookies /tmp/cookies.txt --keep-session-cookies --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILEID' -O- | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p')&id=FILEID" -O FILENAME && rm -rf /tmp/cookies.txt

But how does that work without creating a publicly available link?
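One way to avoid a public link entirely is to authenticate to the Google Drive API instead of fetching the sharing URL. A sketch with curl, assuming you have already obtained an OAuth 2.0 access token with a Drive scope; FILEID and FILENAME stay as your own values:

# Download a private file through the Drive v3 API using an OAuth bearer token
curl -H "Authorization: Bearer $ACCESS_TOKEN" "https://www.googleapis.com/drive/v3/files/FILEID?alt=media" -o FILENAME

curl also covers the resume requirement below via its -C - option.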

Note: wget is not a must for me. If you know any other software, recommend it to me, but it must have the following features:

  1. Command-line based
  2. Ability to resume downloads
  3. Lightweight and portable (if possible, not mandatory)
  4. Multithreaded downloading with joining of file segments (if possible, not mandatory)