Lost .dat file – Bitcoin Stack Exchange

I lost an old .dat file containing 1.5 BTC.

I created it with Bitcoin-Qt 0.8.6, and I still have that exact program on my SD card.

I once heard that every Bitcoin-Qt wallet contains a pool of about 100 addresses. Shouldn't my lost address be in this pool? Can I regenerate the address with this old Qt? I don't think so, but if not, what can I do?

procedural programming – Totally lost in the efficient use of .wl packages

I am trying to learn the Mathematica way of doing things, but in the process I have created one .wl package after another for each module, and in between I import the same modules again and again …

Originally, I did the following: (A)

  • Given a folder, I have a .wls script in which I use a For loop to go through all the files it contains, import each file, and invoke a function/module f. This does a calculation on the input data from the file, alongside other calculations that I do directly (without calling modules).
  • Then, before the end of each loop iteration, I AppendTo all the results found to some lists, and the loop continues (over the remaining files).
  • In the end, I save everything with Export.

Then I put the names of all files into a list lsfiles; instead of the loop, I should just map the module f over the list, i.e. f/@lsfiles. But inside f I access other libraries and modules, e.g. with << and other custom ones with Get["… .wl"]. By doing f/@lsfiles, don't I import the required custom modules as many times as there are filenames in lsfiles? Unlike before with my For loop, where at least everything was imported just once before the loop and never again.

(In fact, my aim is to better understand how to perform tasks of the kind in (A) in a way that is more idiomatic in Mathematica and dispenses with For loops. General recommendations are therefore very welcome.)
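A minimal sketch of the pattern in question (all package, folder, and helper names here are made up): loading the custom packages once with Needs before the mapping avoids re-reading them per file, because Needs — unlike Get — only loads a context that is not already in $Packages:

```wl
(* Load custom packages once, up front; Needs skips contexts that are
   already loaded, whereas Get always re-reads the file. *)
Needs["MyUtils`"]  (* hypothetical package name *)

(* f processes one file and returns its results as a list *)
f[file_] := Module[{data = Import[file]},
  {computeA[data], computeB[data]}  (* hypothetical helper functions *)
]

lsfiles = FileNames["*.csv", "datafolder"];  (* hypothetical folder *)
results = f /@ lsfiles;
Export["results.mx", results]
```

So even if Get["… .wl"] stayed inside f, replacing it with Needs would mean only the first of the mapped calls actually reads the package from disk.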

Airports – How can luggage be lost and never be recovered if it still has a luggage tag?

I lost a bag on a BKK -> ARN -> CPH flight two weeks ago. I immediately filed a report at CPH when the bag had not arrived after more than an hour of waiting. Since then, no updates to the bag's location have appeared on the tracking website.

Theft is a possibility, but there seem to be many cases where luggage is lost "forever" and ends up at an auction.

As a layman, the only sensible explanation (besides theft) for the bag's absence for more than a few days seems to be that the luggage tag has somehow fallen off. If the luggage still has its tag and is somewhere at an airport – be it BKK, ARN or CPH – wouldn't someone have noticed it and scanned the tag?

However, when you read about why luggage is lost, and especially when you look at photos of lost luggage at auctions, it seems quite possible for a bag to have its barcode tag intact and still go "missing".

How can that be? Surely the system must be designed so that a baggage handler can scan the luggage tag at any time and pull up either the missing-baggage report or the passenger's contact information? Or is the baggage-handling identification system far less sophisticated than I imagine? I read somewhere that "after 3 days the barcode is reused for other passengers' luggage", which sounds too silly to be true.

It would be interesting to hear how this works from someone who has insight into the system in real life.

innodb – mysql – import of large tablespace: lost connection during query

I am trying to recover an InnoDB table, which has about 1.5 million rows, from an .ibd file (5.5 GB).

These are the exact steps I take:

  1. Retrieve the CREATE TABLE statement with the mysqlfrm command

  2. Create the table

  3. Discard the tablespace with ALTER TABLE … DISCARD TABLESPACE;

  4. Move the recovered .ibd file into the database directory

  5. ALTER TABLE … IMPORT TABLESPACE;
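The steps above, written out as SQL (with a hypothetical table name t, since the real name is not shown here):

```sql
-- 1./2. Recreate the table from the definition mysqlfrm extracted
CREATE TABLE t ( /* columns from the mysqlfrm output */ ) ENGINE=InnoDB;

-- 3. Throw away the empty tablespace that CREATE TABLE just produced
ALTER TABLE t DISCARD TABLESPACE;

-- 4. At the OS level, copy the recovered t.ibd into the database
--    directory (datadir/<schema>/), then:

-- 5. Adopt the copied file as the table's tablespace
ALTER TABLE t IMPORT TABLESPACE;
```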

and after about 5 minutes I get this error:

ERROR 2013 (HY000): Lost connection to MySQL server during query
my.cnf:

[client]
port=3307
[mysql]
no-beep

[mysqld]
max_allowed_packet=8M
innodb_buffer_pool_size=511M
innodb_log_file_size=500M
innodb_log_buffer_size=800M
net_read_timeout=600
net_write_timeout=600
open_files_limit=100000
skip-grant-tables
port=3307
datadir=D:dbrecover_home_db_homedb
default-storage-engine=INNODB
sql-mode="STRICT_TRANS_TABLES,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION"
log-output=FILE
general-log=0
general_log_file="WIN-36LFCDISVVC.log"
slow-query-log=1
slow_query_log_file="WIN-36LFCDISVVC-slow.log"
long_query_time=10
log-error="WIN-36LFCDISVVC.err"
relay_log="WIN-36LFCDISVVC-relay"
server-id=1
report_port=3307
lower_case_table_names=2
secure-file-priv="C:/ProgramData/MySQL/MySQL Server 5.7/Uploads"
max_connections=151
table_open_cache=2000
tmp_table_size=123M
thread_cache_size=10
myisam_max_sort_file_size=100G
myisam_sort_buffer_size=236M
key_buffer_size=8M
read_buffer_size=64K
read_rnd_buffer_size=256K
innodb_flush_log_at_trx_commit=1
innodb_thread_concurrency=9
innodb_autoextend_increment=64
innodb_buffer_pool_instances=8
innodb_concurrency_tickets=5000
innodb_old_blocks_time=1000
innodb_open_files=300
innodb_stats_on_metadata=0
innodb_file_per_table=1
innodb_checksum_algorithm=0
back_log=80
flush_time=0
join_buffer_size=256K
max_connect_errors=100
sort_buffer_size=256K
table_definition_cache=1400
binlog_row_event_max_size=8K
sync_master_info=10000
sync_relay_log=10000
sync_relay_log_info=10000

Is there a way to import it?
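ERROR 2013 can mean either that a client-side timeout fired or that mysqld itself died mid-import, so the error log configured above (WIN-36LFCDISVVC.err) is worth checking first. If it really is a timeout, the relevant server limits can be raised before retrying — a sketch with illustrative values, not tuned recommendations:

```sql
-- Raise the server-side timeouts before retrying the import
-- (SET GLOBAL only applies to sessions opened afterwards, so reconnect).
SET GLOBAL wait_timeout = 28800;
SET GLOBAL net_read_timeout = 3600;
SET GLOBAL net_write_timeout = 3600;

-- Then retry (table name elided, as above):
ALTER TABLE … IMPORT TABLESPACE;
```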