encryption – Passwords stored as obfuscated text, not encrypted

First thing I’d do is consider whether the system needs “reversibly encrypted” passwords at all. Usually the answer is yes if it’s sending them on to some other service rather than just verifying them when a user logs in, and sometimes yes if some important customer requires it, but there should be an option to properly hash them as well. Second, since you say “a simple mapping”, I assume this isn’t actually using any modern cryptographic cipher primitives (AES, the *fish family, Salsa20, etc.), so that’s definitely a security bug you can file.

Look up a security contact (email address, etc.). There should be one somewhere on the site. If you can’t find one, try emailing security@company.domain, or just contact their support line and ask for a security contact.

Note that any form of reversible encryption, no matter how up-to-date its ciphers or strong its keys, suffers from a key storage problem: the program needs access to the key, which means anybody who can access the program itself can almost certainly decrypt the data. However, there are still improvements to be made from using real encryption:

  • Real encryption, even with a hardcoded key, will prevent anybody who doesn’t know the key from reversing the encryption if they get access to the DB. It sounds like they currently don’t even meet this – very low – bar.
  • Done correctly, the key should be unique per instance of the app. Getting access to somebody else’s DB shouldn’t reveal anything, even if you know the encryption key used by your own copy/instance of the software.
  • The key should be stored in a location as hard to access as possible. Ideally, it would be stored somewhere not actually extractable (like an HSM), with the app having the ability to request encryption and decryption of arbitrary strings but no other software allowed to access the HSM. At the very least the key needs to be separate from the DB, such that even an attacker with total, unfettered DB access can’t get the key without finding a new vulnerability in some other part of the system.

It sounds like you’re already well aware of why they should be using a slow password hashing function, rather than reversible encryption of any sort. Even if they need encryption for some passwords/API keys (stuff used to access external services, not to authenticate local users), they should use encryption for those secrets only, and use secure password hashing algorithms for user passwords.
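To illustrate the distinction, here is a minimal Python sketch of slow password hashing using the standard library’s scrypt. The cost parameters are merely illustrative, the function names are my own, and a dedicated Argon2 library would be the more modern choice:

```python
import hashlib
import hmac
import secrets

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Hash a password with a fresh per-user salt using scrypt (slow by design)."""
    salt = secrets.token_bytes(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash and compare in constant time; the password is never stored."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong guess", salt, digest)
```

Note there is no decrypt step anywhere: verification recomputes the hash, which is exactly why this is the right tool for local user authentication and reversible encryption is not.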

If the vendor won’t budge – says that it’s not a security bug, or that they don’t care, or just refuses to respond – give them some time and then (IMO) it’s time to escalate. If possible for you, try to convince your company to threaten to cancel the contract; that’s often the simplest leverage. If you can’t, I would tend to move up to name and shame. Companies are usually way more likely to respond to things when it’s likely to impact their bottom line, and bad publicity can do that. Sites like https://plaintextoffenders.com/, or just reaching out publicly on social media (especially to, or at least mentioning, well-known security figures), can help get the word out.

Obviously that last part isn’t risk-free. There’s probably something in the terms of use about not “reverse engineering” the software, and although I think this level of “cryptanalysis” doesn’t count at all, I am not a lawyer. If you had to bypass any attempted safeguards to keep you out of the DB – entering a username/password of admin/admin might count, though copying a DB connection string out of a plain-text config file on a system you control does not – then that increases the risk they’d think it worthwhile to involve lawyers.

A smart company wouldn’t do this – siccing the law on somebody who is trying to responsibly report a security issue is a good way to get the entire security community mad at you, and some of us hold grudges and make product recommendations at big companies (and others are hacktivists) – but a smart company wouldn’t let things get nearly that far to begin with.

Before you take any steps beyond just reporting the issue to the vendor, especially if you have any notion of involving your company’s name, you might want to talk to the legal department. However, I am not a lawyer, and this is NOT legal advice.

encryption – Is client-side encrypted data really personal data

Scenario: My service that stores customer files is hosted on my own physical server, “on-prem”. It then uses one of the popular cloud storage services (Azure Blob Storage, AWS S3) to store these customer files. They may or may not contain personal data.

Before the data is sent from my server to the cloud service the data is encrypted with my secret keys that are only ever stored on the on-prem-server.

Since I am using an encryption algorithm that is considered secure and the keys never go to the cloud, would the data I send to Azure/AWS be considered personal data under the GDPR? Would I, for example, have to include the storage service as a sub-processor in my published list of sub-processors?

authentication – Using an encrypted username in API calls

Your approach generally seems OK to me; it is conceptually similar to the well-used approach of issuing an encrypted JWT session token (aka “JWE”) which contains all the user’s metadata, but encrypted so the user cannot read it.

One thing to consider is session fixation: once you issue one of these encrypted tokens, does that mean the user can continue to make API calls forever, or do you have a way to expire or invalidate the token and force the user to log in again? Common approaches are to put an expiry timestamp inside the token (typically ~15 mins to several hours after it was issued), or to tie the token to a login session, for example by embedding the session ID inside the encrypted token and only accepting the token for as long as that session is active.
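The expiry-inside-the-token idea can be sketched with the standard library alone. Note this toy version signs the claims (JWS-style) rather than encrypting them, since Python’s stdlib has no authenticated encryption; all names and the hardcoded key are illustrative:

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"demo-only-server-side-key"  # illustrative; load from secure storage in practice

def issue_token(user_id: str, ttl_seconds: int = 900) -> str:
    """Issue a signed token carrying an absolute expiry timestamp."""
    payload = json.dumps({"sub": user_id, "exp": time.time() + ttl_seconds}).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).digest()
    return (base64.urlsafe_b64encode(payload).decode()
            + "." + base64.urlsafe_b64encode(sig).decode())

def check_token(token: str):
    """Return the user id if the token is authentic and unexpired, else None."""
    payload_b64, sig_b64 = token.split(".")
    payload = base64.urlsafe_b64decode(payload_b64)
    sig = base64.urlsafe_b64decode(sig_b64)
    if not hmac.compare_digest(sig, hmac.new(SECRET, payload, hashlib.sha256).digest()):
        return None  # tampered, or signed with a different key
    claims = json.loads(payload)
    if time.time() > claims["exp"]:
        return None  # expired: force the user to log in again
    return claims["sub"]
```

A session-tied variant would additionally embed a session ID in the claims and check server-side that the session is still active before honouring the token.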

I see in your fernet link that it stores the generation time in the ciphertext, and that its decrypt() has a time-to-live param, so I guess putting an expiry time on the tokens is trivial to do with fernet 🙂

 decrypt(token, ttl=None)

Backup and restore SQL Server database with encrypted columns: what should I backup along with database?

Assume I’ve backed up a SQL Server database with Management Studio, and that the database has some encrypted columns. Now I want to restore this database onto another freshly installed SQL Server instance. To use encrypted columns I had to:

  1. Create database master key;
  2. Create certificate;
  3. Create symmetric key.

Which of those are stored along with the backup, and which should I back up manually and restore on the other server manually as well? I see there are SQL statements backup master key, restore master key, backup certificate, create certificate ... from file. So, along with the database itself, there are:

  1. Master key;
  2. Master key password;
  3. Certificate;
  4. Certificate private key;
  5. Symmetric key.

So which of these (and how) should I deal with when I restore my database on a fresh server? Thank you!
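As a hedged sketch (the certificate name MyCert, database name, and file paths below are placeholders; the exact steps depend on how the keys were created): the certificate and symmetric key live inside the database, so they travel with the backup; typically only the database master key needs re-attaching on the new instance, either by opening it with its creation password or via an explicit key backup.

```sql
-- Option A: after RESTORE DATABASE, re-wire the restored database master key
-- to the new instance's service master key (requires the DMK's password):
USE MyDb;
OPEN MASTER KEY DECRYPTION BY PASSWORD = '<dmk password used at CREATE time>';
ALTER MASTER KEY ADD ENCRYPTION BY SERVICE MASTER KEY;

-- Option B (belt and braces): take explicit key backups on the source server,
-- to be restored on the target if Option A is not possible:
BACKUP MASTER KEY TO FILE = 'C:\backup\dmk.bak'
    ENCRYPTION BY PASSWORD = '<backup password>';
BACKUP CERTIFICATE MyCert TO FILE = 'C:\backup\mycert.cer'
    WITH PRIVATE KEY (FILE = 'C:\backup\mycert.pvk',
                      ENCRYPTION BY PASSWORD = '<backup password>');
```

Keep the backup password(s) somewhere safe and separate from the database backups themselves, for the key-separation reasons discussed in the first answer above.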

Can Kaspersky’s default encrypted connection scanning exclusion list be modified by end user?

I am just wondering whether I can modify Kaspersky’s default encrypted connection scanning exclusion list itself, without any sites being added to or removed from the "Trusted Addresses" list. I have included a screenshot of the list that I am talking about for your convenience.

[screenshot: Kaspersky’s default exclusion list for encrypted connection scanning]

encryption – I have a hex string and hex key, but I’m not sure how it was encrypted. It may be a puzzle

I found the following message written by a developer in a video game:

"key": "0xa6",

The “txt” string is obviously hexadecimal, and the key is only a single byte. I’ve tried the XOR cipher (and several others), but haven’t been able to get the bytes to decode to any legible string. I’ve tried all the common character encodings.

It’s a 91-byte array, and one interesting thing to note is that no byte is repeated more than 3 times, which leads me to believe it’s not a straight “byte to char” conversion and is most likely encrypted in some way (plus, I mean, there’s a key).

As for the key itself, I’m unaware of many encryption algorithms that use 8-bit keys. I’ve tried using 0xA6 (166) as an offset into the string, which curiously enough lands on the final 16 characters of the message (c3adc9e98efb82ac), but I haven’t been able to figure out what to do with that either.
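For what it’s worth, single-byte XOR is cheap to rule in or out by brute force over all 256 keys, scoring candidates by printability. A sketch follows; the ciphertext below is a made-up stand-in, since the real 91-byte string isn’t reproduced here:

```python
import string

# Hypothetical ciphertext: "hello, world" XORed with 0xA6. Substitute the
# actual 91-byte hex string from the game here.
cipher_hex = "".join(f"{b ^ 0xA6:02x}" for b in b"hello, world")

def xor_decrypt(hex_str: str, key: int) -> bytes:
    """XOR every byte of a hex-encoded string with a single-byte key."""
    return bytes(b ^ key for b in bytes.fromhex(hex_str))

# Try all 256 keys and print any candidate that is entirely printable text.
printable = set(string.printable.encode())
for key in range(256):
    out = xor_decrypt(cipher_hex, key)
    if all(b in printable for b in out):
        print(f"key=0x{key:02x}: {out!r}")
```

If no key yields anything legible, that argues against plain XOR, and the next suspects would be a repeating-key XOR, an RC4-style stream keyed off that byte, or the key being an index/offset rather than a cipher key at all.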

PieShare.me – File Storage Service | Streaming & Download Platform | Encrypted & Anonymous | Affiliate Program | Proxies-free


Hello WJ community,

After a successful image hosting campaign, we have launched PieShare.me, a new file hosting service with a partner program.

What We Offer:

  • 55% Initial Sale Affiliate Commission + 45% Rebill Commission with Our PPS Affiliate Program
  • 5% Reward of Your Each Website Sale
  • 1TB Storage for Registered Users & 3GB for Premium (Storage is expandable for Active Affiliate Partners)
  • 5GB Upload Size for Registered Users & 10GB Upload Size for Premium
  • Support themaCreator, Multi Drag & Drop and Remote Upload
  • Advanced Affiliate Statistic
  • 90 Day Inactive File Deletion for Free Users & Lifetime Storage without Deletion for Premium
  • High Speed & Anonymous Downloads and Streaming Video Playback for Premium
  • Recommended Videos and Screenshot Thumbnail Previews for Premium

Affiliate Rules:

  • One account per user, shared accounts or shared premiums are immediately suspended without warning
  • Payouts are processed using PayPal, WebMoney, and BTC (Bitcoin); more options will be added in the future
  • Minimum payout amount: 20 USD. We pay every working day after the hold period expires
  • We reserve the right to modify the Rewards program at any time without prior notice
  • Please read our Terms of Service for a more detailed overview of the rules
  • We appreciate feedback – please contact us with any queries.

We do allow legal adult & NSFW files, but we strictly prohibit uploading child-abuse material and other illegal material. Affiliate partners must agree to follow our TOS; otherwise we will have to suspend the user from our service and report them to the legal authorities.

WJ support: @kiboboss
Telegram: @pieshare

adb – Android device ro.crypto.state returns “encrypted”, but I can access the /data partition. How?

I have a OnePlus 3 with an unlocked bootloader and LineageOS. I forgot the PIN, I am trying to recover access to it, and I have access to the recovery. The data is encrypted.

My understanding is that, if the device is using FBE, the value adb shell getprop ro.crypto.state will return encrypted, and the /data partition will be impossible to read.

Despite the fact that adb shell getprop ro.crypto.state returns encrypted, and despite the fact that I have not entered the PIN stored in the hardware keystore, I am able to browse and copy files from /data using adb shell.

So this seems to be a conflict: I should not be able to access /data if that property returns encrypted. Is my understanding correct? If not, why is this not a conflict?
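One hedged diagnostic sketch (property and path names as seen on stock Android; behaviour can vary by ROM and recovery): ro.crypto.state only says whether encryption is enabled at all, while ro.crypto.type distinguishes file-based from full-disk encryption, and under FBE the /data filesystem itself mounts at boot with only the credential-encrypted per-user directories locked.

```
adb shell getprop ro.crypto.state   # "encrypted" for both FBE and FDE
adb shell getprop ro.crypto.type    # "file" = FBE, "block" = FDE

# Under FBE the filesystem is mounted, but credential-encrypted (CE)
# storage such as /data/user/0 should show scrambled file names until
# the PIN has been entered once after boot.
adb shell ls /data/user/0
```

If ro.crypto.type returns file, readable device-encrypted (DE) files alongside locked CE directories would resolve the apparent conflict.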

(I apologize if this has been asked before, I could not find any questions on this.)

debian – Recover encrypted LVM

I had Debian install running on LUKS encrypted LVM.

This morning, I wanted to install an OS on an external disk to be used on another host, but I mistakenly chose the wrong disk… A few seconds in I noticed that I had chosen the wrong one, so I stopped the process right away, but the partition table had already been written.

To recover, I booted a system rescue CD, and with the help of testdisk I recovered the partitions. However, I still couldn’t boot my old setup, so I started looking for a way to at least recover the files. I ran the rescue disk again and tried to activate and mount the volumes, but it fails with the following:

# vgchange -v -ay worker1-vg

WARNING: Device /dev/mapper/recoveryx has size of 444214667 sectors which is smaller than corresponding PV size of 486313984 sectors. Was device resized?
WARNING: One or more devices used as PVs in VG worker1-vg have changed sizes.
Activating logical volume worker1-vg/root.
activation/volume_list configuration setting not defined: Checking only host tags for worker1-vg/root.
Creating worker1--vg-root
Loading table for worker1--vg-root (254:1).
device-mapper: reload ioctl on (254:1) failed: Invalid argument
Removing worker1--vg-root (254:1)
Activating logical volume worker1-vg/swap_1.
activation/volume_list configuration setting not defined: Checking only host tags for worker1-vg/swap_1.
Creating worker1--vg-swap_1
Loading table for worker1--vg-swap_1 (254:1).
device-mapper: reload ioctl on (254:1) failed: Invalid argument
Removing worker1--vg-swap_1 (254:1)
Activated 0 logical volumes in volume group worker1-vg.
0 logical volume(s) in volume group "worker1-vg" now active

Luckily the VG is detected and I still have the LUKS header, but as can be seen in the output above, activation fails with “device-mapper: reload ioctl on (254:1) failed: Invalid argument”.

Any ideas how to recover this partition?
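A hedged recovery sketch (device and VG names copied from the output above; the parted step is illustrative): the first warning says the decrypted device is about 42 million sectors (~20 GiB) smaller than the PV metadata expects, which usually means testdisk recreated the underlying partition too small, so the fix is to grow the partition back to its original size before activating.

```
# How big is the decrypted device now, vs. what LVM expects?
blockdev --getsz /dev/mapper/recoveryx      # currently 444214667 sectors
pvs --units s -o pv_name,pv_size            # PV metadata expects 486313984

# If the partition was recreated too small, grow it back to (at least) its
# original end with parted/fdisk, re-open the LUKS container, then retry:
# parted /dev/sdX resizepart <n> <original end>
cryptsetup open /dev/sdXN recoveryx
vgchange -ay worker1-vg
```

Work on a dd image of the disk rather than the original if at all possible, since further writes to a damaged partition table can make things worse.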

encryption – Are there security vulnerabilities with apps/webpages/software while running and unloading from encrypted databases to memory for use?

I’m looking for an “in” to learning a bit more about the security of software, apps, and web pages while they are in operation. I don’t really know the terminology well enough to search properly for previous posts or other research, so I’m hoping to be pointed in the right direction here :]

What I mean by “security while in operation” is this: when my app, software, or web page is running, i.e. open and being interacted with, what are the security risks, if any? My context is this: I’m going to be working with encrypted databases to store my users’ data (Hive databases, and SQL with other encryption methods as well), so for storage at rest it seems I’ve got it licked; it seems pretty simple in that direction. BUT what about while the software is running? Say I architect the app to load a specific pack of data out of an encrypted Hive box, and it gets decrypted into a standard List so it can be worked with. What are the security risks here, if any? Would failing to consider this be a major security vulnerability?
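One concrete, if best-effort, mitigation for decrypted data sitting in memory is to shorten its lifetime by overwriting the buffer as soon as you are done with it. A Python sketch, illustrative only (a garbage-collected runtime can keep hidden copies alive, e.g. via slices or str conversions):

```python
import ctypes

def wipe(buf: bytearray) -> None:
    """Best-effort zeroing of a mutable buffer in place."""
    ctypes.memset((ctypes.c_char * len(buf)).from_buffer(buf), 0, len(buf))

secret = bytearray(b"decrypted user record")  # stand-in for data from the Hive box
try:
    # ... work with `secret` only for as long as it is actually needed ...
    assert secret.startswith(b"decrypted")
finally:
    wipe(secret)  # plaintext no longer lives in this buffer

assert secret == bytearray(len(b"decrypted user record"))  # all zeros now
```

The bigger risks in this area are usually elsewhere, though: process memory dumps, swap files, crash logs, and debugger access, which is why the common advice is to rely on OS-level protections (full-disk encryption, per-process memory isolation) rather than trying to keep a running process’s memory secret from its own host.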

Also in your answer can you please state the answer to the above as it pertains to each development platform i.e. web, app, standard computer software. That is, if there is any difference of course.

Further context:

I’m not designing anything top secret; in fact, encryption may even be overkill, but I sincerely respect my users’ privacy and data. Therefore I want to make sure I cover all the bases and design with this in mind.

Thanks for any input :]