Small secret of big mining: inexpensive FPGA for getting ...

GPU Mining Crash Course - START HERE!

Welcome All to the GPUMining Crash Course!
With the increase in prices in cryptocurrency, a lot of people are getting back into mining and a lot of people are brand new to the concept overall. So, I quickly wrote this crash course to help you understand what to expect and how to successfully mine your first cryptocurrency. This crash course isn't gonna have all of the fluff you'd see in a normal publication. This is just everything you need to know to get up and running on your first cryptocurrency mining rig.

What is cryptocurrency mining?

One of the main things about cryptocurrencies is that they are "decentralized". Sounds great, but WTF does that even mean? Well, the easiest way to explain it is...
You know how, if you want to send money to a friend or family member digitally, you do it through your bank? Your bank likely takes a transaction fee, and in a few days it will transfer the money. Since cryptocurrencies are decentralized, there is no bank or organization to carry out the transfer. Instead, the network outsources its computing power to miners (soon to be you). These miners verify transactions, secure the blockchain, and power the cryptocurrency's specific network, among other things. As an incentive, the miners collect transaction fees on the transactions that they verify, and collect block rewards while new currency is still being introduced into the ecosystem.
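If you want to see the core idea in action, here's a minimal sketch (my own toy example, not the actual Bitcoin algorithm): miners race to find a nonce that makes the block's hash meet a difficulty target, and whoever finds it first earns the reward.

    import hashlib

    def mine(block_data, difficulty_zeros=4):
        # Toy proof-of-work: find a nonce so that the SHA-256 hash of the
        # block data plus the nonce starts with `difficulty_zeros` zero hex digits.
        target_prefix = "0" * difficulty_zeros
        nonce = 0
        while True:
            digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
            if digest.startswith(target_prefix):
                return nonce, digest
            nonce += 1

    # Whoever finds a valid nonce first gets to publish the block and collect
    # the block reward plus the transaction fees it contains.
    nonce, digest = mine("alice->bob:0.1")
    print("winning nonce:", nonce, "hash:", digest)

Real networks use far higher difficulty and different hash functions, but the race is the same.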

What kind of rig should I build?

You can mine cryptocurrencies using your CPU, GPU, FPGA, or ASIC, but this is a GPU Mining subreddit, so I will cater this to GPUs.
For building a great all-around GPU rig, there are two models of GPUs that I'd recommend:
Both of these GPUs have solid hashrates across most mining algorithms and for a decent price! You should be able to find both of these kinds of GPUs used for around $200-$250 each, which is a great price if you know what happened during the last mining craze! ($200 GPUs were out of stock everywhere and people were reselling them for $600+ each)
There are also plenty of great AMD GPUs for mining, but I've worked mostly with Nvidia so that's why both of my recommendations are Nvidia and not AMD.
Other parts to your rig that you'll need are listed below. Most of these can be pieces of crap and are just needed to make the rig actually run, but the one spot you DON'T want to cheap out on is the power supply unit. A decent power supply unit will keep your home from burning down while also keeping your rigs up and running smoothly. Here are my recommendations:

She's built, now what?

Now you need to do a few things. I am a Windows miner, so I will be speaking to Windows here:
  1. Update Windows - Do all of the updates. Just do it.
  2. Update Drivers - Go to Nvidia's website and download GeForce Experience. It will keep your GPU drivers up to date.
  3. Go to Windows Device Manager and make sure all of your GPUs show up under "Display Adapters". If it is there, but it isn't showing the Name/Model of the GPU as the name, right click it and select "Update Driver". This should fix it.
Assuming you've done all of this, you're ready to download a mining application.

Mining Software

There are tons to choose from! Claymore, Phoenix, EWBF, LolMiner, etc... It can be overwhelming pretty quickly since they all have different algorithm support, speeds, efficiencies, and a whole lot more. On top of that, in order to get them running you need to set up batch files that call the proper exe, point it at the correct pool, and configure a bunch of other stuff that can be confusing to a new user. Not to mention, you will probably need a separate miner, config file, batch file, etc. for each different algorithm that you're interested in mining.
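To give you a feel for what those batch files boil down to, here's a rough sketch. Every name in it (the executable, pool address, wallet, and flags) is a placeholder I made up; check your specific miner's documentation for its real options.

    import subprocess

    # Placeholder values -- substitute your actual miner, pool, and wallet.
    miner_exe = "miner.exe"                            # hypothetical miner executable
    pool_url  = "stratum+tcp://pool.example.com:4444"  # hypothetical pool address
    wallet    = "YOUR_WALLET_ADDRESS"
    worker    = "rig01"

    # A typical batch file is just one long command line like this one.
    cmd = [miner_exe, "-o", pool_url, "-u", f"{wallet}.{worker}", "-p", "x"]
    print("would run:", " ".join(cmd))
    # subprocess.run(cmd)   # uncommenting this is effectively what a .bat file does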
Instead, I recommend that you download a miner management software that will take care of most of this tedious work for you. There are a few in the sidebar, but the /GPUMining favorite is AIOMiner. It was developed by our very own community member, xixspiderxix with the intention of making mining as easy as possible to do and without any fees. It supports over 100 different algorithms, so you'll be able to mine nearly ANY cryptocurrency you'd like. Just download it from their website and it will take you through a quick tutorial to help you get set up! You can also connect your rig to their website for remote monitoring and control. You've probably seen a few of their posts around this subreddit.
Other Windows mining software includes:
Note: Many mining programs have fees built into them. Most are around 1%, but they can go as high as 5% or greater! You want mining software with little or no fees so that you get to keep as much cryptocurrency as possible. These fees aren't something you actively pay; the software automatically takes them by mining on the developer's behalf for a given amount of time and then switching back to mining on your own behalf. So, please be diligent in evaluating the software you use and make sure it is reputable.

I keep hearing about NiceHash. What is that?

The asshole of the mining industry. Jk, but not really.
NiceHash is a software program that allows you to sell your rig's hashing power to someone on their marketplace. They market themselves as profitable mining, but you're not really mining. You're selling your power in exchange for Bitcoin.
They did a great job telling people that with them, you're always mining the most profitable coin, but that's just not true. Since it is a mining marketplace, they make you mine whatever their most expensive contract is. If their contracts are below market prices, then you're not operating as efficiently and profitably as you could be.
NiceHash also has a sketchy history, which continues to this day. In 2017, they were hacked and lost $65M worth of Bitcoin. No one got paid out for MONTHS and many of their executives conveniently resigned. Their platform has also been used to attack cryptocurrencies. Since people are able to purchase mining power on their platform, malicious users have bought enough hash power to take control of individual cryptocurrencies and double-spend coins, increasing their own wealth while completely destroying the integrity of the coin's blockchain. Horizen (formerly ZenCash), Ethereum Classic, and many other great cryptocurrencies have been victims of NiceHash's platform.
For this and many other reasons, we highly recommend that you stay AWAY from Nicehash. We understand that it is extremely easy to use and you get paid in bitcoin, but they are destroying the industry with their greed and lack of motivation to change their platform for the protection of cryptocurrencies.

Concluding Thoughts

This is pretty much everything you need to know to get started. We covered the hardware, setting up the software, and which software to use. AIOMiner's tutorial will get you up to speed on how to actually mine the cryptocurrency you want better than I can explain it, so I'll leave that part to them.
If you have any questions on this crash course, please leave a comment below, where other community members and I will be able to help you out.
submitted by The_Brutally_Honest to gpumining

Debunking myths about mining and GPUs

E: Going to bed, will contribute more tomorrow. Thanks for the discussion!
Myth: Mining is more stressful than gaming. Fact: It depends. During the old days, this was plausible, because older GPUs (Pre-polaris) are/were bottlenecked by core clock when mining the most profitable coins. Thus, miners overclocked and overvolted these cards quite frequently, especially with cheap electricity. This meant that those cards were often run hot, pushing the limits and stressing VRM and fans quite a lot. Nowadays, ethash (Ethereum) is the most profitable algorithm for AMD cards 99% of the time, and newer GPUs (Polaris) are limited by memory bandwidth and latency. Miners can underclock core to the low 1100MHz range before seeing performance drop. To save power, miners who know what they are doing also undervolt, since it is no longer necessary to sustain a high core clock. Thus, it is quite feasible to run polaris cards below 70C at a reasonable fan speed. However, dual mining (mining more than one coin at once) does increase power consumption by up to 20%, and there are also idiots who run their polaris cards OCd while mining. With the exception of a few idiots, miners treat their Polaris GPUs pretty much the same; that is, running underclocked and undervolted 24/7 with a memory strap mod and mem OC. On the other hand, former gaming cards are highly variable in use cases. Some gamers leave their cards at stock settings, some undervolt, and some OC and/or overvolt. Most of the time, these cards are thermal cycled far more often than mining cards, which is known to weaken solder. Another thing to consider is that manufacturers have learned (somewhat) from their mistakes of putting shit tier fans in GPUs, and many fans on modern GPUs are ball bearing and/or swappable. Even some budget cards, such as MSI Armor, use decent ball bearing fans. Bottom line: the risk of buying mined Polaris cards is not as high as the risk of buying older mined cards. I would not be against buying mined polaris cards, but it's not necessarily better than buying a gamer's card instead. At the end of the day, it depends more on how the owner treated it than what they used it for.
Myth: GPUs are obsolete because of FPGAs and ASICs Fact: Mostly false. Older algorithms such as scrypt and SHA256 (lite/doge/feather/bitcoin, etc.) are no longer feasible to mine with GPUs, but there have been multiple algorithms since then that are built to deter ASICs; most of the time it is done by making the algorithm memory-hard, because an ASIC with high memory throughput is considerably more expensive to design and manufacture. Many devs prefer their blockchain to be ASIC resistant to avoid the concentration of power problem that Bitcoin is having nowadays, where a giant, near-monopolistic ASIC manufacturer (Bitmain) is causing a lot of (subjective) controversy. Blockchains based on ethash (Ethereum and its forks), equihash (Zcash and its forks) and cryptonight (Monero and forks) are some examples, but there are scores of other shitcoins and a few other algos that are GPU dominant. It is almost impossible that there will be another ASIC takeover, which is what was responsible for the stop in GPU demand in the bitcoin and litecoin days. Bottom line: ASICs no longer threaten GPU miners, or the demand for GPUs
Myth: Ethereum switching to Proof of Stake will kill mining soon Fact: Doomsayers have been preaching about proof of stake since late 2015. It has always been "coming soon." The fact is, the Ethereum roadmap goes from proof of work (mining) -> Casper (mining + PoS) -> Metropolis (PoS). Currently, the release date of Casper is not even announced yet, nor is it being tested in a (public) testnet. Proof of Stake might one day take over, but mining is here to stay for a while yet. Another thing to consider is that there are tons of other GPU mineable blockchains, and although Ethereum is biggest, it is certainly feasible that mining stays profitable even after Ethereum goes PoS (if it ever does). However, it is possible that profits will be low enough to discourage new miners. Bottom line: It's very unlikely. E: I screwed up the roadmap; here is a better source than me with some interesting information: https://www.ethnews.com/ethereums-vitalik-buterin-gives-keynote-on-metropolis
Myth: The current Ethereum demand spike is a bubble Opinion: Honestly, I don't know. I would not be surprised if stricter regulations on ICOs come sooner or later, which would fuck with Ether prices. There is also the inherent volatility of cryptocurrencies. However, it is also possible that blockchain technology continues to gain traction; that is, the price could just as easily go up as go down. Although it's fun to read about other people's opinions, only time-travelling wizards can tell you when it will become economical again to upgrade your poor HD5770. Bottom line: No one knows.
Myth: Miners will "steal" all the RX Vegas Fact: Only a reckless miner would buy Vegas on release, since mining performance is not known. In fact, it is possible that it can't mine at all (or at some stupidly low speed) until devs add support to existing miners. It would be even more reckless than gamers who buy without seeing benchmarks, since at least gamers can expect the games to actually run. It's also not necessarily the case that Vega will be good once miners do add support. Maybe there will be enough reckless miners to affect supply, maybe not. Of course, it is possible that miners will deplete the supply after it is demonstrated that Vega is good for mining. Bottom line: Most miners won't preorder, but it's possible that a significant number will. E: Important to remember that even if mining demand isn't high, doesn't mean that supply will be plentiful.
Myth: Nvidia cards SUCK at mining Fact: Mostly false. They USED to suck in the old pre-Maxwell days, but now they are actually more efficient at mining Ethereum and Zcash compared to AMD cards, even after both cards are undervolted. The flipside is that they (used to) cost more for the equivalent hashrate. For reference, my old 5xRX470 rig drew just under 800W when mining ETH only and hashed at 150MH/s. My current 6xGTX1060 rig draws just over half of that (<450W) and hashes at about 135MH/s. Certainly not as good in raw performance, but they are viable nonetheless, especially given the AMD GPU shortage. In fact, Nvidia cards (1060 and especially 1070) are becoming scarce as well. Bottom line: Nvidia is still the underdog when it comes to mining, but far from irrelevant nowadays.
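Running the numbers on those two rigs makes the efficiency point concrete (figures are the ones quoted above, rounded):

    # Efficiency of the two rigs described above, mining ETH only.
    rigs = {
        "5x RX 470":   {"hashrate_mh": 150, "power_w": 800},
        "6x GTX 1060": {"hashrate_mh": 135, "power_w": 450},
    }
    for name, rig in rigs.items():
        print(name, round(rig["hashrate_mh"] / rig["power_w"], 3), "MH/s per watt")
    # ~0.188 MH/s/W for the AMD rig vs ~0.3 MH/s/W for the Nvidia rig.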
Myth: 4GB cards will be obsolete for mining soon Fact: FALSE. The Ethereum DAG is not even 3GB yet, and won't be for a few months. The recent reports of 4GB Polaris cards slowing down soon due to DAG size are explained by limited TLB capacity, not VRAM restrictions. Polaris cards will still be able to mine ETH forks such as Expanse and UBIQ without diminished speed, and even if they are used to mine ETH, it is not that much of a performance hit at first. It would certainly not make Polaris useless or undesirable for mining anytime soon. Tahiti GPUs already suffer from this issue and Hawaii is the most resistant to it. I have not benched Nvidia at a later epoch.
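For a rough sense of the timeline, here's a back-of-the-envelope DAG estimate using the commonly cited Ethash constants (roughly 1 GiB at epoch 0, growing by about 8 MiB per 30,000-block epoch); the real DAG is slightly smaller because the size is trimmed to a prime number of entries.

    # Approximate Ethash DAG size; slightly overestimates the real value.
    EPOCH_LENGTH   = 30_000   # blocks per epoch
    DATASET_INIT   = 2**30    # ~1 GiB at epoch 0
    DATASET_GROWTH = 2**23    # ~8 MiB added per epoch

    def approx_dag_gib(block_number):
        epoch = block_number // EPOCH_LENGTH
        return (DATASET_INIT + DATASET_GROWTH * epoch) / 2**30

    print(round(approx_dag_gib(5_900_000), 2), "GiB")  # ~2.53 GiB at a mid-2018 block height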
Myth: Creating miner-bashing posts on Reddit will help alleviate the GPU supply problem Fact: False, you are simply giving cryptocurrencies and mining more exposure to the general public, increasing demand.
Myth: Mining-specific GPUs will solve the shortage problems Opinion: There's not enough info to tell yet, but I am a skeptic for the following reasons. First, the lack of display outputs limits the resale value of the card for obvious reasons. IMO, the whole point of crypto mining from a profitability standpoint is to have a hedge against coin volatility (hardware is still worth something if the coin crashes). Otherwise it is much less effort to just buy and hold the coin. If the hardware is useless without demand from other (significant) sources, then it doesn't make much sense to buy it unless the price is extremely low. I'm sure that cost-downing the PCB and warranty will make for a cheap card, but it has to be extremely cheap and plentiful in supply, or else miners will buy whatever they can get. I could envision "failed" chips (not meeting spec of consumer editions) being stuck in miner cards, but I doubt there are enough to meet demand without ramping up production as a whole, which carries its own risks. I guess that it would help a little, but probably not solve the problems. Alternatively, since modern GPUs are bottlenecked by RAM when mining, it might be enticing to miners to have the fastest (GDDR5) RAM on the market (probably the 9gbps chips from the 1060 6G 9gbps edition, although I don't have one to test). However, my previous points still apply; buying such a card without display outputs carries a big risk. Bottom line: It's not a great idea, unless they are super cheap or use really good RAM.
Hope this helped; if you have any further questions I will try to answer them. I'm both a gamer and miner who uses both AMD and Nvidia roughly equally and don't favor one group over another. I've mined and gamed on all high end AMD GPUs since Tahiti (except Tonga) and all Pascal cards except 1050ti.
submitted by key_smash to Amd

Technical Cryptonight Discussion: What about low-latency RAM (RLDRAM 3, QDR-IV, or HMC) + ASICs?

The Cryptonight algorithm is described as ASIC resistant, in particular because of one feature:
A megabyte of internal memory is almost unacceptable for the modern ASICs. 
EDIT: Each instance of Cryptonight requires 2MB of RAM. Therefore, any Cryptonight multi-processor is required to have 2MB per instance. Since CPUs come with plenty of fast on-die cache (i.e. 32MB L3 on Threadripper, 16MB L3 on Ryzen, and plenty of L2+L3 on Skylake servers), it seems unlikely that ASICs would be able to compete well vs CPUs.
In fact, a large number of people seem to be incredibly confident in Cryptonight's ASIC resistance. And indeed, anyone who knows how standard DDR4 works knows that DDR4 is unacceptable for Cryptonight. GDDR5 similarly doesn't look like a very good technology for Cryptonight, focusing on high-bandwidth instead of latency.
This suggests that only custom RAM built into an ASIC would be able to handle the 2MB that Cryptonight uses. It's a solid argument, but to my eyes it is missing a critical point of analysis.
What about "exotic" RAM, like RLDRAM3 ?? Or even QDR-IV?

QDR-IV SRAM

QDR-IV SRAM is absurdly expensive. However, it's a good example of "exotic RAM" that is available on the market, and I'm focusing on it because QDR-IV is really simple to describe.
QDR-IV costs roughly $290 for a 16Mbit x 18-bit part. It is true static RAM. The 18-bit width is 8 data bits per byte plus 1 parity bit, because QDR-IV is usually designed for high-speed routers.
QDR-IV has none of the speed or latency issues of DDR4 RAM. There are no "banks", no "refreshes", no "obliterate the data as you load it into the sense amplifiers", and no "auto-precharge" as you load the data from the sense amps back into the capacitors.
Anything that could have caused latency issues is gone. QDR-IV is about as fast as you can get latency-wise. Every clock cycle you specify an address, and QDR-IV generates a response every clock cycle. In fact, QDR means "quad data rate": the SRAM performs 2 reads and 2 writes per clock cycle. There is a slight amount of latency: 8 clock cycles for reads (7.5 nanoseconds) and 5 clock cycles for writes (4.6 nanoseconds). For those keeping track at home: AMD Zen's L3 cache has a latency of 40 clocks, aka 10 nanoseconds at 4GHz.
Basically, QDR-IV BEATS the L3 latency of modern CPUs. And we haven't even begun to talk software or ASIC optimizations yet.
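To make the comparison explicit, here's the cycle-to-nanosecond conversion; the ~1066 MHz QDR-IV clock is my assumption for a 2133 MT/s part, so the figures land within rounding of the ones above.

    def ns(cycles, clock_ghz):
        # Latency in nanoseconds for a given number of clock cycles.
        return cycles / clock_ghz

    print(round(ns(8, 1.066), 1), "ns - QDR-IV read  (8 cycles at an assumed ~1066 MHz)")
    print(round(ns(5, 1.066), 1), "ns - QDR-IV write (5 cycles)")
    print(round(ns(40, 4.0), 1), "ns - AMD Zen L3   (40 cycles at 4 GHz)")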

CPU inefficiencies for Cryptonight

Now, if that weren't bad enough... CPUs have a few problems with the Cryptonight algorithm.
  1. AMD Zen and Intel Skylake CPUs transfer data from L3 -> L2 -> L1 cache. Each of these transfers is in 64-byte chunks, but Cryptonight only uses 16 of those bytes. This means that 75% of L3 cache bandwidth is wasted on 48 bytes that are never used per inner loop of Cryptonight (see the sketch after this list). An ASIC would transfer only 16 bytes at a time, instantly increasing the RAM's effective speed by 4-fold.
  2. AES-NI instructions on Ryzen / Threadripper can only be done one-per-core. This means a 16-core Threadripper can at most perform 16 AES encryptions per clock tick. An ASIC can perform as many as you'd like, up to the speed of the RAM.
  3. CPUs waste a ton of energy: there's L1 and L2 caches which do NOTHING in Cryptonight. There are floating-point units, memory controllers, and more. An ASIC which strips things out to only the bare necessities (basically: AES for Cryptonight core) would be way more power efficient, even at ancient 65nm or 90nm designs.
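Here's the sketch promised in point 1, putting numbers on the cache-line waste (a simplification: real cache behavior has prefetching and other effects, but the 4x factor is the headline):

    CACHE_LINE_BYTES = 64   # what the CPU moves between cache levels
    BYTES_USED       = 16   # what Cryptonight's inner loop actually touches

    wasted = 1 - BYTES_USED / CACHE_LINE_BYTES
    print(f"{wasted:.0%} of each cache-line transfer is wasted")            # 75%
    print(f"bandwidth multiplier for a 16-byte-wide ASIC: {CACHE_LINE_BYTES // BYTES_USED}x")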

Ideal RAM access pattern

For all y'all who are used to DDR4, here's a special trick with QDR-IV or RLDRAM: you can pipeline accesses. What does this mean?
First, it should be noted that Cryptonight's inner loop has a simple RAM access pattern: read 16 bytes from an effectively random address in the 2MB scratchpad, transform them, write 16 bytes back, and repeat, for 524,288 inner-loop iterations per hash.
QDR-IV and RLDRAM3 still have latency involved. Assuming 8 clocks of latency, the naive access pattern would be:
  1. Read
  2. Stall
  3. Stall
  4. Stall
  5. Stall
  6. Stall
  7. Stall
  8. Stall
  9. Stall
  10. Write
  11. Stall
  12. Stall
  13. Stall
  14. Stall
  15. Stall
  16. Stall
  17. Stall
  18. Stall
  19. Read #2
  20. Stall
  21. Stall
  22. Stall
  23. Stall
  24. Stall
  25. Stall
  26. Stall
  27. Stall
  28. Write #2
  29. Stall
  30. Stall
  31. Stall
  32. Stall
  33. Stall
  34. Stall
  35. Stall
  36. Stall
This isn't very efficient: the RAM sits around waiting. Even with "latency reduced" RAM, you can see that the RAM still isn't doing very much. In fact, this is why people thought Cryptonight was safe against ASICs.
But what if we instead ran four instances in parallel? That way, there is always data flowing.
  1. Cryptonight #1 Read
  2. Cryptonight #2 Read
  3. Cryptonight #3 Read
  4. Cryptonight #4 Read
  5. Stall
  6. Stall
  7. Stall
  8. Stall
  9. Stall
  10. Cryptonight #1 Write
  11. Cryptonight #2 Write
  12. Cryptonight #3 Write
  13. Cryptonight #4 Write
  14. Stall
  15. Stall
  16. Stall
  17. Stall
  18. Stall
  19. Cryptonight #1 Read #2
  20. Cryptonight #2 Read #2
  21. Cryptonight #3 Read #2
  22. Cryptonight #4 Read #2
  23. Stall
  24. Stall
  25. Stall
  26. Stall
  27. Stall
  28. Cryptonight #1 Write #2
  29. Cryptonight #2 Write #2
  30. Cryptonight #3 Write #2
  31. Cryptonight #4 Write #2
  32. Stall
  33. Stall
  34. Stall
  35. Stall
  36. Stall
Notice: we're doing 4x the Cryptonight in the same amount of time. Now imagine if the stalls were COMPLETELY gone. DDR4 CANNOT do this. And that's why most people thought ASICs were impossible for Cryptonight.
Unfortunately, RLDRAM3 and QDR-IV can accomplish this kind of pipelining. In fact, that's what they were designed for.
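A quick utilization model of the two timelines above (taking the 18-cycle read-to-read spacing straight from the listings) shows why interleaving matters; this is my own simplification and ignores bus-width limits, which the next section covers.

    WINDOW_CYCLES = 18  # cycles from Read #1 to Read #2 in the listings above

    def ram_utilization(parallel_instances):
        # Each active instance contributes one read and one write per window.
        busy_cycles = 2 * parallel_instances
        return min(busy_cycles / WINDOW_CYCLES, 1.0)

    print(f"{ram_utilization(1):.0%} of cycles busy - naive, single instance")
    print(f"{ram_utilization(4):.0%} of cycles busy - four instances interleaved")
    print(f"{ram_utilization(9):.0%} of cycles busy - enough instances to hide every stall")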

RLDRAM3

As good as QDR-IV RAM is, it's way too expensive. RLDRAM3 is almost as fast, but is way more complicated to use and describe. Due to the lower cost of RLDRAM3, however, I'd assume any ASIC for Cryptonight would use RLDRAM3 instead of the simpler QDR-IV. A 32Mbit x 36-bit RLDRAM3 part costs $180 at quantity 1, and would support up to 64 parallel Cryptonight instances (in contrast, an $800 AMD 1950X Threadripper supports 16 at best).
Such a design would basically operate at the maximum speed of RLDRAM3. In the case of a x36-bit bus at 2133 MT/s, we're talking about 2,133,000,000 transfers/s / (burst length of 4 x 4 reads/writes x 524,288 inner-loop iterations) == 254 full Cryptonight hashes per second.
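Spelling that arithmetic out (same numbers as above; a sanity check of the estimate, not a claim about real silicon):

    transfer_rate = 2133e6    # transfers per second at 2133 MT/s
    burst_length  = 4         # transfers per 16-byte access on a x36 bus
    accesses      = 4         # reads/writes per inner-loop iteration
    inner_loops   = 524_288   # iterations per Cryptonight hash

    hashes_per_sec = transfer_rate / (burst_length * accesses * inner_loops)
    print(round(hashes_per_sec, 1), "H/s per RLDRAM3 chip")        # ~254
    print(round(hashes_per_sec) * 60, "H/s for 60 such modules")   # 15240, at roughly 300 W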
254 Hashes per second sounds low, and it is. But we're talking about literally a two-chip design here. 1-chip for RAM, 1-chip for the ASIC/AES stuff. Such a design would consume no more than 5 Watts.
If you were to replicate the ~5W design 60-times, you'd get 15240 Hash/second at 300 Watts.

RLDRAM2

Depending on cost calculations, going cheaper and "making more" might be a better idea. RLDRAM2 is widely available at only $32 per chip at 800 MT/s.
Such a design would theoretically support 800,000,000 / (4 x 4 x 524,288) == 95 Cryptonight hashes per second.
The scary part: The RLDRAM2 chip there only uses 1W of power. Together, you get 5 Watts again as a reasonable power-estimate. x60 would be 5700 Hashes/second at 300 Watts.
Here's Micron's whitepaper on RLDRAM2: https://www.micron.com/~/media/documents/products/technical-note/dram/tn4902.pdf . RLDRAM3 is the same but denser, faster, and more power efficient.

Hybrid Memory Cube

Hybrid Memory Cube is "stacked RAM" designed for low latency. As far as I can tell, Hybrid Memory Cube allows an insane amount of parallelism and pipelining. It'd be the future of an ASIC Cryptonight design. The existence of Hybrid Memory Cube is more about "Generation 2" or later: in effect, it demonstrates that future designs can be lower-power and higher-speed.

Realistic ASIC Sketch: RLDRAM3 + Parallel Processing

The overall board design would be the ASIC, which would be a simple pipelined AES ASIC that talks with RLDRAM3 ($180) or RLDRAM2 ($30).
It's hard for me to estimate an ASIC's cost without the right tools or design. But a multi-project wafer service like MOSIS offers "cheap" access to 14nm and 22nm nodes. Rumor is that this is roughly $100k per run for ~40 dies, suitable for research and development. Mass production would require further investment, but mass production at the ~65nm node is rumored to be in the single-digit millions of dollars, or maybe even just six figures or so.
So realistically speaking: it'd take ~$10 Million investment + a talented engineer (or team of engineers) who are familiar with RLDRAM3, PCIe 3.0, ASIC design, AES, and Cryptonight to build an ASIC.

TL;DR:

submitted by dragontamer5788 to Monero

Cryptocurrency Mining History : Journey to PoC

Cryptocurrency, just like any other technological development, has given birth to many side industries and trends: ICOs, white paper writing, mining, etc. Just as cryptocurrency itself rises, falls and changes to adapt to real-life conditions, so do its side industries and trends. Today we are going to focus on mining: how it has risen, fallen and adapted through the journey of cryptocurrency to date.
Without going into details, crypto mining is the process by which new blocks are validated and added to the blockchain. It first hit the mainstream in January 2009, when the mysterious Satoshi Nakamoto launched the Bitcoin network, whose white paper proposed the first mining consensus mechanism, called proof of work (PoW).
The PoW consensus mechanism requires one to spend a certain amount of computational power to solve a cryptographic puzzle (finding a nonce) in order to earn the right to pack/verify the next block on the blockchain. In this mechanism, the more computational power one possesses, the more rights one has over the packing of the next block. The quest for faster hardware has driven significant changes in the types of hardware dominating the PoW mining community.
Back in 2009, when Bitcoin first started, a normal PC and its processing power worked just fine. In fact, a PC with an Intel i7 processor could mine up to 50 BTC per day, but back then that was worth almost nothing, since BTC traded for only a few cents. When the difficulty of the network became significantly higher, simple computer processing units could no longer compete, and so miners settled for something more powerful: high-end graphics processors (GPUs). This is when the era of rigs began, in 2010. People would combine GPUs together in mining rigs on a motherboard, usually six per rig, and some miners operated farms containing many of these rigs. Of course, with greater power came greater network difficulty, and the search for faster hardware led to the adoption of Field Programmable Gate Arrays (FPGAs) in June 2012. A further search for faster, less power-hungry and cheaper hardware brought us to where we are today: in 2013, Application Specific Integrated Circuit (ASIC) miners were introduced. One ASIC miner processes 1500 H/s, which is 100 times the processing power of a CPU or GPU. But all these speed and efficiency achievements brought about another problem, one which touches the core of cryptocurrency itself: the idea of decentralization was gradually fading away, as the wealthy and big companies were the ones who could afford and build the miners, thereby centralizing mining around the rich. There were calls for an ASIC-resistant consensus mechanism.
A movement for ASIC-resistant PoW algorithms began. The idea is to make ASIC mining impossible, or at least make it such that using an ASIC doesn't give a miner any additional advantage over using a CPU. In 2013, Monero, the famous privacy coin, adopted CryptoNight as an ASIC-resistant PoW algorithm, or at least that is how it was intended to be. Things have proven much more difficult in practice than anticipated, as ASIC producers keep overcoming every barrier the PoW designers put in place, at a rate faster than it takes to build those barriers. Monero, for example, has had to fork every now and then in order to keep CryptoNight ASIC-resistant, a trick which is still not working, as one of its developers put it: "We [also] saw that this was very unsustainable. … It takes a lot to keep [hard forking] again and again for one. For two, it may decentralize mining but it centralizes in another area. It centralizes on the developers because now there's a lot of trust in developers to keep hard forking." Another ASIC-resistant PoW algorithm is RandomX, and there are many others, but one could quickly imagine that the barriers to ASIC mining in these ASIC-resistant algorithms would eventually be broken by ASIC makers. And so a total shift from PoW mining to other consensus mechanisms that are ASIC-resistant at their core was proposed, some of which are in use today.
Enter the Proof of Stake (PoS) consensus mechanism. PoS was first introduced in 2013 by the Peercoin team. Here, a validator's right to mine is proportional to his or her economic stake in the network; simply put, the more coins you hold, the more mining rights you get. Apart from Peercoin, NEO and Lisk also use PoS, and Ethereum plans to follow. There are different variations of PoS, including but not limited to delegated proof of stake (DPoS) and masternode proof of stake (MPoS), each of which seeks to improve on some aspect of PoS. It is a very good ASIC-resistant consensus mechanism, but it still doesn't solve the centralization problem, as the rich always have the power to buy more coins and thus gain more mining rights, and it is also expensive to get started. Many other proposals have been made to combat this, among which are Proof of Weight and Proof of Capacity (PoC). We take more interest in PoC: it is the latest, and as of now it gives the best answer to these mining challenges.
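As a toy illustration of "more coins, more block rights" (real PoS protocols add randomness beacons, slashing, lock-ups and much more; the names and numbers here are made up):

    import random

    stakes = {"alice": 500, "bob": 300, "carol": 200}   # coins staked by each validator

    def pick_validator(stakes, seed):
        # Stake-weighted random choice; in a real chain the seed is derived
        # from the chain itself so everyone agrees on the winner.
        rng = random.Random(seed)
        names, weights = zip(*stakes.items())
        return rng.choices(names, weights=weights, k=1)[0]

    wins = {name: 0 for name in stakes}
    for block_height in range(10_000):
        wins[pick_validator(stakes, block_height)] += 1
    print(wins)   # roughly 50% / 30% / 20% of blocks, matching the stake split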
Proof of Capacity was first described in 2013 in the Proofs of Space paper by Dziembowski, Faust, Kolmogorov and Pietrzak, and it is now being used in Burst. The main factor that separates all the mining mechanisms is the resource used. The resource that miners spend in order to earn mining rights is a way of ensuring that one has expended a non-trivial amount of effort in making a statement. The resource being spent in PoC is disk space. This is less expensive, since many people already have some unused space lying around, and space is a cheap resource in the tech world. It does not discriminate by geography, and it really solves a lot of the centralization problems present in almost all other consensus mechanisms. If the future is now, then one could say the future of crypto mining is PoC.
submitted by seekchain to u/seekchain

Crypto and the Latency Arms Race: Crypto Exchanges and the HFT Crowd

News by Coindesk: Max Boonen
Carrying on from an earlier post about the evolution of high frequency trading (HFT), how it can harm markets and how crypto exchanges are responding, here we focus on the potential longer-term impact on the crypto ecosystem.
First, though, we need to focus on the state of HFT in a broader context.

Conventional markets are adopting anti-latency arbitrage mechanisms

In conventional markets, latency arbitrage has increased toxicity on lit venues and pushed trading volumes over-the-counter or into dark pools. In Europe, dark liquidity has increased in spite of efforts by regulators to clamp down on it. In some markets, regulation has actually contributed to this. Per the SEC:
“Using the Nasdaq market as a proxy, [Regulation] NMS did not seem to succeed in its mission to increase the display of limit orders in the marketplace. We have seen an increase in dark liquidity, smaller trade sizes, similar trading volumes, and a larger number of “small” venues.”
Why is non-lit execution remaining or becoming more successful in spite of its lower transparency? In its 2014 paper, BlackRock came out in favour of dark pools in the context of best execution requirements. It also lamented message congestion and cautioned against increasing tick sizes, features that advantage latency arbitrageurs. (This echoes the comment to CoinDesk of David Weisberger, CEO of Coinroutes, who explained that the tick sizes typical of the crypto market are small and therefore do not put slower traders at much of a disadvantage.)
Major venues now recognize that the speed race threatens their business model in some markets, as it pushes those “slow” market makers with risk-absorbing capacity to provide liquidity to the likes of BlackRock off-exchange. Eurex has responded by implementing anti-latency arbitrage (ALA) mechanisms in options:
“Right now, a lot of liquidity providers need to invest more into technology in order to protect themselves against other, very fast liquidity providers, than they can invest in their pricing for the end client. The end result of this is a certain imbalance, where we have a few very sophisticated liquidity providers that are very active in the order book and then a lot of liquidity providers that have the ability to provide prices to end clients, but are tending to do so more away from the order book”, commented Jonas Ullmann, Eurex’s head of market functionality. Such views are increasingly supported by academic research.
XTX identifies two categories of ALA mechanisms: policy-based and technology-based. Policy-based ALA refers to a venue simply deciding that latency arbitrageurs are not allowed to trade on it. Alternative venues to exchanges (going under various acronyms such as ECN, ATS or MTF) can allow traders to either take or make, but not engage in both activities. Others can purposefully select — and advertise — their mix of market participants, or allow users to trade in separate “rooms” where undesired firms are excluded. The rise of “alternative microstructures” is mostly evidenced in crypto by the surge in electronic OTC trading, where traders can receive better prices than on exchange.
Technology-based ALA encompasses delays, random or deterministic, added to an exchange’s matching engine to reduce the viability of latency arbitrage strategies. The classic example is a speed bump where new orders are delayed by a few milliseconds, but the cancellation of existing orders is not. This lets market makers place fresh quotes at the new prevailing market price without being run over by latency arbitrageurs.
As a practical example, the London Metal Exchange recently announced an eight-millisecond speed bump on some contracts that are prime candidates for latency arbitrageurs due to their similarity to products trading on the much bigger CME in Chicago.
Why 8 milliseconds? First, microwave transmission between Chicago and the US East Coast is 3 milliseconds faster than fibre optic lines. From there, the $250,000 a month Hibernia Express transatlantic cable helps you get to London another 4 milliseconds faster than cheaper alternatives. Add a millisecond for internal latencies such as not using FPGAs and 8 milliseconds is the difference for a liquidity provider between investing tens of millions in speed technology or being priced out of the market by latency arbitrage.
With this in mind, let’s consider what the future holds for crypto.

Crypto exchanges must not forget their retail roots

We learn from conventional markets that liquidity benefits from a diverse base of market makers with risk-absorption capacity.
Some have claimed that the spread compression witnessed in the bitcoin market since 2017 is due to electronification. Instead, I posit that it is greater risk-absorbing capacity and capital allocation that has improved the liquidity of the bitcoin market, not an increase in speed; in fact, being a fast exchange with colocation, such as Gemini, has not translated into higher volumes. Old-timers will remember Coinsetter, a company that, per the Bitcoin Wiki, "was created in 2012, and operates a bitcoin exchange and ECN. Coinsetter's CSX trading technology enables millisecond trade execution times and offers one of the fastest API data streams in the industry." The Wiki page should use the past tense, as Coinsetter failed to gain traction, was acquired in 2016 and subsequently closed.
Exchanges that invest in scalability and user experience will thrive (BitMEX comes to mind). Crypto exchanges that favour the fastest traders (by reducing jitter, etc.) will find that winner-takes-all latency strategies do not improve liquidity. Furthermore, they risk antagonising the majority of their users, who are naturally suspicious of platforms that sell preferential treatment.
It is baffling that the head of Russia for Huobi vaunted to CoinDesk that: “The option [of co-location] allows [selected clients] to make trades 70 to 100 times faster than other users”. The article notes that Huobi doesn’t charge — but of course, not everyone can sign up.
Contrast this with one of the most successful exchanges today: Binance. It actively discourages some HFT strategies by tracking metrics such as order-to-trade ratios and temporarily blocking users that breach certain limits. Market experts know that Binance remains extremely relevant to price discovery, irrespective of its focus on a less professional user base.
Other exchanges, take heed.
Coinbase closed its entire Chicago office where 30 engineers had worked on a faster matching engine, an exercise that is rumoured to have cost $50mm. After much internal debate, I bet that the company finally realised that it wouldn’t recoup its investment and that its value derived from having onboarded 20 million users, not from upgrading systems that are already fast and reliable by the standards of crypto.
It is also unsurprising that Kraken’s Steve Hunt, a veteran of low-latency torchbearer Jump Trading, commented to CoinDesk that: “We want all customers regardless of size or scale to have equal access to our marketplace”. Experience speaks.
In a recent article on CoinDesk , Matt Trudeau of ErisX points to the lower reliability of cloud-based services compared to dedicated, co-located and cross-connected gateways. That much is true. Web-based technology puts the emphasis on serving the greatest number of users concurrently, not on serving a subset of users deterministically and at the lowest latency possible. That is the point. Crypto might be the only asset class that is accessible directly to end users with a low number of intermediaries, precisely because of the crypto ethos and how the industry evolved. It is cheaper to buy $500 of bitcoin than it is to buy $500 of Microsoft shares.
Trudeau further remarks that official, paid-for co-location is better than what he pejoratively calls “unsanctioned colocation,” the fact that crypto traders can place their servers in the same cloud providers as the exchanges. The fairness argument is dubious: anyone with $50 can set up an Amazon AWS account and run next to the major crypto exchanges, whereas cheap co-location starts at $1,000 a month in the real world. No wonder “speed technology revenues” are estimated at $1 billion for the major U.S. equity exchanges.
For a crypto exchange, to reside in a financial, non-cloud data centre with state-of-the-art network latencies might ironically impair the likelihood of success. The risk is that such an exchange becomes dominated on the taker side by the handful of players that already own or pay for the fastest communication routes between major financial data centres such as Equinix and the CME in Chicago, where bitcoin futures are traded. This might reduce liquidity on the exchange because a significant proportion of the crypto market’s risk-absorption capacity is coming from crypto-centric funds that do not have the scale to operate low-latency strategies, but might make up the bulk of the liquidity on, say, Binance. Such mom-and-pop liquidity providers might therefore shun an exchange that caters to larger players as a priority.

Exchanges risk losing market share to OTC liquidity providers

While voice trading in crypto has run its course, a major contribution to the market’s increase in liquidity circa 2017–2018 was the risk appetite of the original OTC voice desks such as Cumberland Mining and Circle.
Automation really shines in bringing together risk-absorbing capacity tailored to each client (which is impossible on anonymous exchanges) with seamless electronic execution. In contrast, latency-sensitive venues can see liquidity evaporate in periods of stress, as happened to a well-known and otherwise successful exchange on 26 June which saw its bitcoin order book become $1,000 wide for an extended period of time as liquidity providers turned their systems off. The problem is compounded by the general unavailability of credit on cash exchanges, an issue that the OTC market’s settlement model avoids.
As the crypto market matures, the business model of today’s major cash exchanges will come under pressure. In the past decade, the FX market has shown that retail traders benefit from better liquidity when they trade through different channels than institutional speculators. Systematic internalizers demonstrate the same in equities. This fact of life will apply to crypto. Exchanges have to pick a side: either cater to retail (or retail-driven intermediaries) or court HFTs.
Now that an aggregator like Tagomi runs transaction cost analysis for their clients, it will become plainly obvious to investors with medium-term and long-term horizons (i.e. anyone not looking at the next 2 seconds) that their price impact on exchange is worse than against electronic OTC liquidity providers.
Today, exchange fee structures are awkward because they must charge small users a lot to make up for crypto’s exceptionally high compliance and onboarding costs. Onboarding a single, small value user simply does not make sense unless fees are quite elevated. Exchanges end up over-charging large volume traders such as B2C2’s clients, another incentive to switch to OTC execution.
In the alternative, what if crypto exchanges focus on HFT traders? In my opinion, the CME is a much better venue for institutional takers as fees are much lower and conventional trading firms will already be connected to it. My hypothesis is that most exchanges will not be able to compete with the CME for fast traders (after all, the CBOE itself gave up), and must cater to their retail user base instead.
In a future post, we will explore other microstructures beyond all-to-all exchanges and bilateral OTC trading.
submitted by GTE_IO to u/GTE_IO

Flashback Friday: When I though I was going to be rich :-)

I was going through my crypto archive and here is a picture from when times were not just better but unbelievably profitable for a hobby miner (and made me spend a bit too much money on hardware).

This was on 24 January 2018, and I was pulling 0.0026 BTC per day (back then about 23 EUR at the EUR/BTC rate) on 10 low-to-mid-range GPUs. This was without any kind of OC (I was going full blast, it was just so profitable, lol).
2 x RX 570 + 8 Nvidia (3 x 1050 Ti + 5 x 1060 6GB).
Now I have 2 more GPUs and finely tuned OC, and I would get around 0.5 mBTC/day (below electricity cost in my region).
https://i.redd.it/ikgdsawsysk11.jpg
I guess when something is too good to be true, it's too good to be true. But I was drawn into it. I was always curious about Bitcoin when BTC was 200 euro, but back then it already wasn't profitable to mine anyway because of FPGAs and ASICs, and I wasn't really clear on where to buy BTC without getting scammed out of my money. BTC went to 600 euro and I thought things had peaked already, so I figured: fuck it, I'm too late for the party. I admit FOMO brought me in with the bull run at the end of 2017.

Now we are in September. Back in December/January I projected that I would break even on my initial investment in July. Right now, I'm at something like <30% of break-even, and the value of my stack is still going down since I've HODLed all of my BTC and never sold.

Nowadays, my rig is off because electricity is too expensive to make it run (Western Europe problem, I know), so I can't even continue progressing toward break-even, even when electricity is cheaper at night. I tried mining ETH directly, but the difficulty is ridiculous, the value took a nosedive, and payouts are unpredictable because of pool luck and the PPLNS scheme. I tried mining RVN but came late to the party; pool luck and payouts are even worse on RVN than on ETH.
- I would make more RVN mining ETH and buying RVN with ETH.
- I would also make more ETH mining on Nicehash and purchasing ETH with BTC.
But then, I would be trading so what's the point of mining.

I think I'll go deeper down the rabbit hole and start mining speculative low-difficulty coins if I can find a good project, but every project I hear about seems to already be past that point, so I can't build a bag :-(

I'm happy I haven't gone overboard with the spending and I'm glad I've learned many things in this space along the way. So glad I didn't pull the trigger on a couple of Titan XPs; I seriously considered it back when it was still profitable. But being in profit seems to be a milestone I'll probably never reach. So I guess "I'm in it for the tech," like so many people say.

I guess this post will have the merit to warn potential adventurers to reconsider spending any amount of money they are not ready to be parted from.

If however you have "free electricity" I still wish you good luck, because at a large scale you still won't break even on the initial investment if you build today, and at a small scale you won't make a buck anyway and will just provide cheap hashrate to NH buyers.


submitted by Spykwak to NiceHash

I own 179 BTC, here is my story

I am not a wealthy person by any means, but Bitcoin has helped.
I discovered Bitcoin via a post on overclock.net on April 27th, 2011. I believe the price was about $1.50/coin then. I read the posts about people mining them, did some research, and immediately started my Radeon card mining them. I had a 4770 back then.
There was an exchange to sell Bitcoins for linden dollars (Second Life currency) and then I could sell those for paypal dollars. Within a day I had proven to my wife that I could make money with this Bitcoin thing. Despite us being in a position where we couldn't even pay our credit cards, I took the $1100 we had and bought 4 5850's, some power supplies, and some cheap craigslist computers. I figured that if this whole Bitcoin thing failed miserably, at least I had some decent computer hardware I could resell and recover most of the cost. I immediately sold one 5850 for greater-than-market value since they were in demand and I needed the money, and started the other 3 mining. At one point, I was mining nearly 8 coins a day. I bought a few more cards as time went on and continued GPU mining for as long as it was viable.
This whole thing saved us financially. I was able to sell the Bitcoins and settle on my unpayable credit card debts. I held on to a few during the crash but managed to sell most of them at $10 or more, fortunately. After that I started saving them, since they were worth so little. I bought some of the early BFL FPGA miners, the ones that were measured in MHashes not GHashes. After mining with those for a while and then selling them to someone who wanted them more than I did, I had more than 450 BTC. I took the plunge and pre-ordered BFL's latest offerings, the 60GH singles, the day they were available, becoming one of the first on the preorder list. Little did I know I would have been much better off just holding those coins...
Regardless, I did eventually receive those singles, and managed to get about 225 BTC out of them before they were no longer worth running. I've been slowly selling the stash as we needed for remodel projects around the house and for miscellaneous expenses, though I finally no longer need to do so, as we've been able to pay off more debts and have more income than expenses each month. Now I've got a nice pile of savings, and I'm hoping to someday be able to use it to buy a better house in a better neighborhood.
I generally don't tell people that I have just about all my liquid assets in Bitcoin, as they would call me crazy. They might be right. But it's a risk I'm willing to take. I do have some equity in my house, and some retirement accounts, but neither is worth more than my BTC stash.
So that's MY story, what's yours?
submitted by bitcoinzzzz to Bitcoin

A cryptocurrency (or crypto currency) is a digital asset

Blockchain

Main article: Blockchain
The validity of each cryptocurrency's coins is provided by a blockchain. A blockchain is a continuously growing list of records, called blocks, which are linked and secured using cryptography.[23][26] Each block typically contains a hash pointer as a link to a previous block,[26] a timestamp and transaction data.[27] By design, blockchains are inherently resistant to modification of the data. It is "an open, distributed ledger that can record transactions between two parties efficiently and in a verifiable and permanent way".[28] For use as a distributed ledger, a blockchain is typically managed by a peer-to-peer network collectively adhering to a protocol for validating new blocks. Once recorded, the data in any given block cannot be altered retroactively without the alteration of all subsequent blocks, which requires collusion of the network majority.
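A minimal sketch of that structure (illustrative only; real blockchains add Merkle trees, difficulty targets, and consensus rules):

    import hashlib, json, time

    def block_hash(block):
        # Deterministically hash a block's contents.
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def new_block(prev_block, transactions):
        return {
            "prev_hash": block_hash(prev_block) if prev_block else "0" * 64,  # the hash pointer
            "timestamp": time.time(),
            "transactions": transactions,
        }

    genesis = new_block(None, ["coinbase -> alice: 50"])
    second  = new_block(genesis, ["alice -> bob: 10"])

    # Tampering with an earlier block breaks the hash pointer stored in every later block.
    genesis["transactions"][0] = "coinbase -> mallory: 50"
    print(second["prev_hash"] == block_hash(genesis))   # False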
Blockchains are secure by design and are an example of a distributed computing system with high Byzantine fault tolerance. Decentralized consensus has therefore been achieved with a blockchain.[29] Blockchains solve the double-spending problem without the need of a trusted authority or central server, assuming no 51% attack (that has worked against several cryptocurrencies).

Timestamping

Cryptocurrencies use various timestamping schemes to "prove" the validity of transactions added to the blockchain ledger without the need for a trusted third party.
The first timestamping scheme invented was the proof-of-work scheme. The most widely used proof-of-work schemes are based on SHA-256 and scrypt.[16]
Some other hashing algorithms that are used for proof-of-work include CryptoNight, Blake, SHA-3, and X11.
The proof-of-stake is a method of securing a cryptocurrency network and achieving distributed consensus through requesting users to show ownership of a certain amount of currency. It is different from proof-of-work systems that run difficult hashing algorithms to validate electronic transactions. The scheme is largely dependent on the coin, and there's currently no standard form of it. Some cryptocurrencies use a combined proof-of-work/proof-of-stake scheme.[16]

Mining

[Image: Hashcoin mine]
In cryptocurrency networks, mining is a validation of transactions. For this effort, successful miners obtain new cryptocurrency as a reward. The reward decreases transaction fees by creating a complementary incentive to contribute to the processing power of the network. The rate of generating hashes, which validate any transaction, has been increased by the use of specialized machines such as FPGAs and ASICs running complex hashing algorithms like SHA-256 and Scrypt.[30] This arms race for cheaper-yet-efficient machines has been on since the day the first cryptocurrency, bitcoin, was introduced in 2009.[30] With more people venturing into the world of virtual currency, generating hashes for this validation has become far more complex over the years, with miners having to invest large sums of money on employing multiple high performance ASICs. Thus the value of the currency obtained for finding a hash often does not justify the amount of money spent on setting up the machines, the cooling facilities to overcome the enormous amount of heat they produce, and the electricity required to run them.[30][31]
Some miners pool resources, sharing their processing power over a network to split the reward equally, according to the amount of work they contributed to the probability of finding a block. A "share" is awarded to members of the mining pool who present a valid partial proof-of-work.
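A toy version of that share mechanism (the difficulties here are arbitrary hex-digit counts, nothing like real network parameters):

    import hashlib

    def meets_target(header, nonce, zero_digits):
        digest = hashlib.sha256(f"{header}{nonce}".encode()).hexdigest()
        return digest.startswith("0" * zero_digits)

    NETWORK_DIFFICULTY = 8   # full blocks must clear this (illustrative value)
    SHARE_DIFFICULTY   = 3   # the pool credits "shares" at this much easier target

    header, shares = "example block header", 0
    for nonce in range(200_000):
        if meets_target(header, nonce, SHARE_DIFFICULTY):
            shares += 1                                    # counts toward this member's payout
            if meets_target(header, nonce, NETWORK_DIFFICULTY):
                print("full solution found at nonce", nonce)
    print("shares submitted:", shares)   # proportional to work done, on average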
As of February 2018, the Chinese Government halted trading of virtual currency, banned initial coin offerings and shut down mining. Some Chinese miners have since relocated to Canada.[32] One company is operating data centers for mining operations at Canadian oil and gas field sites, due to low gas prices.[33] In June 2018, Hydro Quebec proposed to the provincial government to allocate 500 MW to crypto companies for mining.[34] According to a February 2018 report from Fortune,[35] Iceland has become a haven for cryptocurrency miners in part because of its cheap electricity. Prices are contained because nearly all of the country's energy comes from renewable sources, prompting more mining companies to consider opening operations in Iceland.[citation needed]
In March 2018, a town in Upstate New York put an 18-month moratorium on all cryptocurrency mining in an effort to preserve natural resources and the "character and direction" of the city.[36]

GPU price rise

An increase in cryptocurrency mining increased the demand of graphics cards (GPU) in 2017.[37] Popular favorites of cryptocurrency miners such as Nvidia's GTX 1060 and GTX 1070 graphics cards, as well as AMD's RX 570 and RX 580 GPUs, doubled or tripled in price – or were out of stock.[38] A GTX 1070 Ti which was released at a price of $450 sold for as much as $1100. Another popular card GTX 1060's 6 GB model was released at an MSRP of $250, sold for almost $500. RX 570 and RX 580 cards from AMD were out of stock for almost a year. Miners regularly buy up the entire stock of new GPU's as soon as they are available.[39]
Nvidia has asked retailers to do what they can when it comes to selling GPUs to gamers instead of miners. "Gamers come first for Nvidia," said Boris Böhles, PR manager for Nvidia in the German region.[40]

Wallets

[Image: An example paper printable bitcoin wallet consisting of one bitcoin address for receiving and the corresponding private key for spending]
Main article: Cryptocurrency wallet
A cryptocurrency wallet stores the public and private "keys" or "addresses" which can be used to receive or spend the cryptocurrency. With the private key, it is possible to write in the public ledger, effectively spending the associated cryptocurrency. With the public key, it is possible for others to send currency to the wallet.
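For a feel of how those keys relate, here's a sketch using the third-party ecdsa package; the "address" it prints is just a hash for illustration, not a real Bitcoin address (those add RIPEMD-160 hashing, a version byte, and Base58Check encoding).

    import hashlib
    import ecdsa  # pip install ecdsa

    # Key pair on secp256k1, the curve Bitcoin uses.
    private_key = ecdsa.SigningKey.generate(curve=ecdsa.SECP256k1)
    public_key  = private_key.get_verifying_key()

    # Toy "address": a hash of the public key (NOT a real Bitcoin address).
    address = hashlib.sha256(public_key.to_string()).hexdigest()[:40]
    print("receive at:", address)

    # Spending means signing a transaction with the private key; anyone with
    # the public key can check the signature.
    tx = b"send 0.1 coins from me to you"
    signature = private_key.sign(tx)
    print("signature valid:", public_key.verify(signature, tx))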

Anonymity

Bitcoin is pseudonymous rather than anonymous in that the cryptocurrency within a wallet is not tied to people, but rather to one or more specific keys (or "addresses").[41] Thereby, bitcoin owners are not identifiable, but all transactions are publicly available in the blockchain. Still, cryptocurrency exchanges are often required by law to collect the personal information of their users.
Additions such as Zerocoin, Zerocash and CryptoNote have been suggested, which would allow for additional anonymity and fungibility.[42][43]
submitted by TheResearcher012 to GreatLifePostsGoTeam

AMD's Growing CPU Advantage Over Intel

https://seekingalpha.com/article/4152240-amds-growing-cpu-advantage-intel?page=1
AMD's Growing CPU Advantage Over Intel Mar. 1.18 | About: Advanced Micro (AMD)
Raymond Caron, Ph.D. (Tech, solar, natural resources, energy)
Summary: AMD's past and economic hazards. AMD's current market conditions. AMD's Zen CPU advantage over Intel.
AMD is primarily a CPU fabrication company with much experience and a great history in that respect. They hold patents for 64-bit processing, as well as ARM-based processing patents and GPU architecture patents. AMD built a name for itself in the mid-to-late 90's when it introduced the K-series CPUs to good reviews, followed by the Athlon series in '99. AMD was profitable, and it bought the companies NexGen, Alchemy Semiconductor, and ATI.
Past Economic Hazards
If AMD has such a great history, then what happened? Before I go over the technical advantage that AMD has over Intel, it's worth looking at how AMD failed in the past, and whether those hazards still present a risk to AMD. For investment purposes we're more interested in AMD turning a profit. AMD suffered from intermittent CPU fabrication problems, and was also the victim of sustained anti-competitive behaviour from Intel, which interfered with AMD's attempts to sell its CPUs to the market through Sony, Hitachi, Toshiba, Fujitsu, NEC, Dell, Gateway, HP, Acer, and Lenovo. Intel was investigated and/or fined by multiple countries including Japan, Korea, the USA, and the EU. These hazards need to be examined to see if history will repeat itself. There have been some rather large changes in the market since then.
1) The EU has shown they are not averse to levying large fines, and Intel is still fighting the guilty verdict from the last EU fine levied against them; they've already lost one appeal. It's conceivable that the EU, and other countries, would prosecute Intel again. This is compounded by the recent security problems with Intel CPUs and the fact that Intel advertised these CPUs as secure when it knew they were not. Here are some of the largest fines dished out by the EU:
2) The Internet has evolved from Web 1.0 to 2.0. Consumers are increasing their online presence each year. This reduces the clout that Intel can wield over the market as AMD can more easily sell to consumers through smaller Internet based companies.
3) Traditional distributors (HP, Dell, Lenovo, etc.) are struggling. All of these companies have had recent issues with declining revenue due to Internet competition and ARM competition. These companies are struggling for sales, and this reduces the clout that Intel has over them, as Intel is no longer able to ensure their future. It no longer pays to be in the club. These points are summarized in the graph below, from Statista, which shows "ODM Direct" sales and "other sales" increasing their market share from 2009 to Q3 2017.
4) AMD spun off Global Foundries as a separate company. AMD has a fabrication agreement with Global Foundries, but is also free to fabricate at another foundry such as TSMC, where AMD has recently announced they will be printing Vega at 7nm.
5) Global Foundries developed the capability to fabricate at 16nm, 14nm, and 12nm alongside Samsung and IBM, and bought the process from IBM to fabricate at 7nm. These three companies have been cooperating to develop new fabrication nodes.
6) The computer market has grown much larger since the mid-90s to 2006 period, when AMD last had a significant tangible advantage over Intel. Computer sales rose steadily until 2011 before starting a slow decline (see the Statista graph below). The decline corresponds directly to the loss of competition between AMD and Intel after AMD released the Bulldozer CPU in 2011. Tablets also became available starting in 2010 and contributed to the fall in computer sales, which began in 2012. It's important to note that computer shipments did not fall in 2017, they remained static, and AMD's GPU market share rose in Q4 2017 at the expense of Nvidia and Intel.
7) In terms of fabrication, AMD has access to 7nm at Global Foundries as well as through TSMC. It's unlikely that AMD will experience CPU fabrication problems in the future. This is something of a reversal of fortunes, as Intel is now experiencing issues with its 10nm fabrication facilities, which are behind schedule by more than two years, and maybe longer. It would be costly for Intel to use another foundry to print their CPUs due to the overhead that their current foundries place on their bottom line. If Intel is unable to get the 10nm process working, they're going to have difficulty competing with AMD.
AMD: Current market conditions
In 2011 AMD released its Bulldozer line of CPUs to poor reviews and was relegated to selling on the discount market where sales margins are low. Since that time AMD's profits have been largely determined by the performance of its GPU and Semi-Custom business. Analysts have become accustomed to looking at AMD's revenue from a GPU perspective, which isn't currently being seen in a positive light due to the relation between AMD GPUs and cryptocurrency mining.
The market views cryptocurrency as a further risk to AMD. When Bitcoin was introduced it was also mined with GPUs. When mining switched to ASICs (simple, inexpensive, purpose-built circuits) for increased profitability, the GPUs purchased for mining were resold on the market and ended up competing with and hurting new AMD GPU sales. There is also perceived risk to AMD from Nvidia, which has favorable reviews for its Pascal GPU offerings. While AMD has been selling GPUs, they haven't increased GPU supply in response to cryptocurrency demand, while Nvidia has. This resulted in a very high price for AMD GPUs relative to Nvidia's. There are strategic reasons for AMD's current position:
1) While AMD GPUs are profitable and greatly desired for cryptocurrency mining, AMD's market access is through third-party resellers who enjoy the revenue from marked-up GPU sales. AMD most likely makes lower margins on GPU sales relative to Zen CPU sales due to the higher fabrication costs associated with larger dies and the corresponding lower yield. For reference I've included the sizes of AMD's and Nvidia's GPUs as well as AMD's Ryzen CPU and Intel's Coffee Lake 8th-generation CPU. This suggests that if AMD had to pick and choose between products, they'd focus on Zen due to its higher yield, higher revenue from sales, and better margin.
2) If AMD maintained historical levels of GPU production in the face of cryptocurrency demand, while increasing production of Zen products, they would maximize potential income from their highest-margin products (EPYC), while reducing future vulnerability to second-hand GPUs being resold on the market.
3) AMD was burned in the past by second-hand GPUs and wants to avoid repeating that experience. AMD has stated several times that the cryptocurrency boom was not factored into forward-looking statements, meaning they haven't produced more GPUs in the expectation of more GPU sales.
In contrast, Nvidia increased its production of GPUs due to cryptocurrency demand, as AMD did in the past. Since their Pascal GPUs have entered their second year on the market and are capable of running video games for years to come (1080p and 4K gaming), Nvidia will be entering a position where they are competing directly with older GPUs used for mining that are as capable as the cards Nvidia is currently selling. Second-hand GPUs from mining are known to function very well, often needing only a fan replacement. This is because semiconductors work best in a steady state, as opposed to being turned on and off, so a card endures less wear when used 24/7.
The market is also pessimistic regarding AMD's P/E ratio. The market is accustomed to evaluating stocks using the P/E ratio, but this metric is not very accurate for evaluating new companies, or companies going into or coming out of bankruptcy. It is more accurate for evaluating companies that have a consistent operating trend over time.
“Similarly, a company with very low earnings now may command a very high P/E ratio even though it isn’t necessarily overvalued. The company may have just IPO’d and growth expectations are very high, or expectations remain high since the company dominates the technology in its space.” - P/E Ratio: Problems With The P/E
I regard the pessimism surrounding AMD stock due to GPUs and past history as a positive trait, because the threat is minor. While AMD is experiencing competitive problems with its GPUs in gaming, AMD holds an advantage in blockchain processing, which stands to be a larger and more lucrative market. I also believe that AMD's progress with Zen, particularly with EPYC, and the recent Meltdown-related security and performance issues with all Intel CPU offerings far outweigh any GPU turbulence. This turns the pessimism surrounding AMD's GPUs into a stock benefit.
1) A pessimistic group prevents the stock from becoming a bubble. It provides a counter-argument against hype relating to product launches that are not yet proven by earnings, which is unfortunately a historical trend for AMD, as they have had difficulty selling server CPUs and consumer CPUs in the past due to market interference by Intel.
2) It creates predictable daily, weekly, monthly, and quarterly fluctuations in the stock price that can be used to generate income.
3) Due to recent product launches and market conditions (the Zen architecture advantage, the 12nm node launching, the Meltdown performance flaw affecting all Intel CPUs, Intel's problems with 10nm) and the fact that AMD is once again selling a competitive product, AMD is making more money each quarter. Therefore the base price of AMD's stock will rise with earnings, as we're seeing. This is also a form of investment security, where perceived losses are returned over time, because the stock is in a long-term upward trajectory driven by new products reaching a responsive market.
4) AMD remains a cheap stock. While it's volatile, it's in a long-term upward trend due to market conditions and new product launches, so an investor with a limited budget can buy more stock to maximize earnings. This advantage also means that the stock is more easily manipulated, as seen during the Q3 2017 ER.
5) The pessimism is unfounded. The cryptocurrency craze hasn't died; it increased, fell, and recovered. The second-hand market did not see an influx of mining GPUs, as mining remains profitable.
6) Blockchain is an emerging market that will eclipse the gaming market in size due to the wide breadth of applications across various industries. Vega is a highly desired product for blockchain applications, as AMD has retained a processing and performance advantage over Nvidia there. There are more and rapidly growing applications for blockchain every day, all (or most) of which will require GPUs - for instance at Microsoft, the Golem supercomputer, IBM, HP, Oracle, Red Hat, and others.
Long-term upward trend
AMD is at the beginning of a long-term upward trend supported by a comprehensive and competitive product portfolio that is still being delivered to the market; AMD refers to this as product ramping. AMD's most effective Zen products are EPYC and the Raven Ridge APU. EPYC entered the market in mid-December and was completely sold out by mid-January, but has since been restocked. Intel remains uncompetitive in that segment as their CPU offerings are held back by a 40% performance hit from the Meltdown patches. Server CPU sales command the highest margins for both Intel and AMD.
The AMD Raven Ridge APU was recently released to excellent reviews. The APU is significant due to high GPU prices driven by cryptocurrency, and the fact that the APU is a CPU/GPU hybrid with the performance to play today's games at 1080p. The APU also supports the Vulkan API, which can call upon multiple GPUs to increase performance, so a system can be upgraded with an AMD or Nvidia GPU that supports Vulkan at a later date for increased performance in games or workloads that have been programmed to support it. Or the APU can be replaced when GPU prices fall.
AMD also stands to benefit as Intel confirmed that their new 10 nm fabrication node is behind in technical capability relative to the Samsung, TSMC, and Global Foundries 7 nm fabrication processes. This brings into question Intel's competitiveness in 2019 and beyond.
Take-Away
• AMD was uncompetitive with respect to CPUs from 2011 to 2017.
• When AMD was competitive, from 1996 to 2011, they made record profits and bought three companies, including ATI.
• AMD's CPU business suffered from:
  • Market manipulation by Intel (Intel was fined by the EU, Japan, and Korea, and settled with the USA).
  • Foundry productivity and upgrade complications.
• AMD has changed:
  • Global Foundries was spun off as an independent business.
  • Global Foundries has developed 14nm and 12nm, and is implementing 7nm fabrication.
  • Intel is late on 10nm, which is less competitive than the 7nm node.
  • AMD can fabricate products at multiple foundries (TSMC, Global Foundries).
• The market has changed:
  • More AMD products are available on the Internet, and both the adoption of the Internet and the size of the Internet retail market have exploded, thanks to the success of smartphones and tablets.
  • Consumer habits have changed; more people shop online each year, and traditional retailers have lost market share.
  • The computer market is larger on average, but has been declining. While computer shipments declined in Q2 and Q3 2017, AMD sold more CPUs.
• Analysts look to GPU and Semi-Custom sales for revenue.
• The cryptocurrency boom intensified; no crash occurred.
• AMD did not increase GPU production to meet cryptocurrency demand.
• Blockchain represents new growth potential for AMD GPUs.
• Pessimism acts as security against a stock bubble and the corresponding bust.
• It creates cyclical volatility in the stock that can be used to generate profit.
• The P/E ratio is misleading when used to evaluate AMD.
• AMD has long-term growth potential.
• In 2017 AMD released a competitive product portfolio.
• Since Zen was released in March 2017, AMD has beaten ER expectations.
• AMD returned to profitability in 2017.
• AMD is taking measurable market share from Intel in OEM desktop CPUs and in the CPU market overall.
• The high-margin server product EPYC was released in December 2017, just before the worst-ever CPU security bug was found in Intel CPUs, which are hit with a detrimental 40% performance patch.
• The Ryzen APU (Raven Ridge) was announced in February 2018 to meet the gaming GPU shortage created by high GPU demand for cryptocurrency mining.
• Blockchain is a long-term growth opportunity for AMD.
• Intel is behind the competition for the next CPU fabrication node.
AMD's growing CPU advantage over Intel
About AMD's Zen
Zen is a technical breakthrough in CPU architecture because it's a modular design and because it is a small CPU while providing similar or better performance than the Intel competition.
Since Zen was released in March 2017, we've seen AMD go from 18% CPU market share in OEM consumer desktops to essentially 50% market share; this was supported by comments from Lisa Su during the Q3 2017 ER call, by MindFactory.de, and by Amazon CPU sales. We also saw AMD increase its share of total desktop CPUs, and we started seeing market share flux between AMD and Intel as new CPUs are released. Zen is a technical breakthrough supported by a few general principles of electronics, which give AMD an across-the-board advantage over Intel in every CPU market it addresses.
1) The larger the CPU, the lower the yield. - The Zen building block that makes up Ryzen, Threadripper, and EPYC is smaller (44 mm² compared to 151 mm² for Coffee Lake). A larger CPU means fewer CPUs per wafer during fabrication. AMD gets roughly 3x as many Zen dies per wafer as Intel gets Coffee Lake dies, so each CPU has a much lower manufacturing cost.
2) The larger the CPU, the harder it is to fabricate without errors. - The chance that a CPU will be fabricated perfectly falls roughly exponentially with increasing die area, so Intel will have fewer flawless CPUs per wafer than AMD, and AMD will make a higher margin on each CPU sold (a rough yield sketch follows the take-away below). AMD's supply of perfectly printed Ryzen 7 1800X chips was so high that the company had to sell them at a reduced cost in order to meet demand for the cheaper Ryzen 5 1600X. If you bought a 1600X in August/September, you probably ended up with an 1800X.
3) Larger CPUs are harder to fabricate without errors on smaller nodes. - Fabricating CPUs at smaller nodes becomes more difficult due to the higher precision required and the corresponding increase in errors.
"A second reason for the slowdown is that it's simply getting harder to design, inspect and test chips at advanced nodes. Physical effects such as heat, electrostatic discharge and electromagnetic interference are more pronounced at 7nm than at 28nm. It also takes more power to drive signals through skinny wires, and circuits are more sensitive to test and inspection, as well as to thermal migration across a chip. All of that needs to be accounted for and simulated using multi-physics simulation, emulation and prototyping." - Is 7nm The Last Major Node?
"Simply put, the first generation of 10nm requires small processors to ensure high yields. Intel seems to be putting the smaller die sizes (i.e. anything under 15W for a laptop) into the 10nm Cannon Lake bucket, while the larger 35W+ chips will be on 14++ Coffee Lake, a tried and tested sub-node for larger CPUs. While the desktop sits on 14++ for a bit longer, it gives time for Intel to further develop their 10nm fabrication abilities, leading to their 10+ process for larger chips by working their other large chip segments (FPGA, MIC) first."
There are plenty of steps where errors can be introduced into a fabricated CPU. This is most likely the culprit behind Intel's inability to launch its 10nm fabrication process: they're simply unable to print such a large CPU on such a small node with high enough yields to make the process competitive. Intel thought they were ahead of the competition with respect to printing large CPUs on a small node, until AMD avoided the issue completely by designing a smaller, modular CPU. Intel avoided any mention of its 10nm node during its Q4 2017 ER, which I interpret as bad news for Intel shareholders. If you have nothing good to say, you say nothing, and Intel having nothing to say about something that is fundamentally critical to its success as a company can't be good. Intel is, however, on track to deliver hybrid CPUs where some small components are printed on 10nm. It has also recently come to light that Intel's 10nm node is less competitive than the Global Foundries, Samsung, and TSMC 7nm nodes, which means that Intel is now firmly behind in CPU fabrication.
4) AMD Zen is a new architecture built from the ground up. - Intel's CPUs are built on top of an older architecture developed with 30-year-old strategies, some of which we've recently discovered are flawed. This resulted in the Meltdown flaw, the Spectre flaws, and also the ME and AMT bugs in Intel CPUs. While AMD is still affected by Spectre, AMD has only ever acknowledged being fully susceptible to Spectre 1, as AMD considers Spectre 2 difficult to exploit on a Zen CPU.
“It is much more difficult on all AMD CPUs, because BTB entries are not aliased - the attacker must know (and be able to execute arbitrary code at) the exact address of the targeted branch instruction.” - Technical Analysis of Spectre & Meltdown, AMD
Further reading:
• Spectre and Meltdown: Linux creator Linus Torvalds criticises Intel's 'garbage' patches | ZDNet
• FYI: Processor bugs are everywhere - just ask Intel and AMD
• Meltdown and Spectre: Good news for AMD users, (more) bad news for Intel
• Cybersecurity agency: The only sure defense against huge chip flaw is a new chip
• Kernel-memory-leaking Intel processor design flaw forces Linux, Windows redesign
Take-Away
• AMD Zen enjoys a CPU fabrication yield advantage over Intel.
• AMD Zen enjoys a higher yield of high-quality CPUs.
• Intel's CPUs are affected by a 40% performance drop due to the Meltdown flaw, which hurts server CPU sales.
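To make the yield arithmetic behind points 1 and 2 concrete, here is a rough, illustrative sketch in Python. The die areas are the article's figures; the 300 mm wafer and the defect density are assumptions made purely for illustration, and the Poisson yield formula is the textbook simplification, not any foundry's actual model.

```python
import math

# Illustrative only: die areas are the article's figures (mm^2); wafer size,
# defect density, and the Poisson model Y = exp(-A * D0) are assumptions.
WAFER_DIAMETER_MM = 300
DEFECT_DENSITY_PER_CM2 = 0.2   # assumed value for illustration

def gross_dies_per_wafer(die_area_mm2: float) -> float:
    """Crude estimate that ignores edge losses and scribe lines."""
    wafer_area_mm2 = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return wafer_area_mm2 / die_area_mm2

def poisson_yield(die_area_mm2: float) -> float:
    """Fraction of dies expected to come out defect-free."""
    return math.exp(-(die_area_mm2 / 100.0) * DEFECT_DENSITY_PER_CM2)

for name, area in [("Zen building block (44 mm^2)", 44.0),
                   ("Coffee Lake (151 mm^2)", 151.0)]:
    dies = gross_dies_per_wafer(area)
    good = dies * poisson_yield(area)
    print(f"{name}: ~{dies:.0f} gross dies/wafer, ~{good:.0f} good dies/wafer")
```

The smaller die wins twice: more candidates fit on each wafer, and a larger fraction of them come out defect-free, which is the compounding advantage the article describes.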
AMD stock drivers
1) EPYC
• A critically acclaimed CPU that is sold at a discount compared to Intel.
• Not affected by 40% software slow-downs due to Meltdown.
2) Raven Ridge desktop APU
• Targets the underserved GPU market, which has been stifled by cryptocurrency demand.
• Customers can upgrade to a new CPU or add a GPU at a later date without changing the motherboard.
• AM4 motherboard supported until 2020.
3) Vega GPU sales to Intel for 8th-generation CPUs with integrated graphics
• AMD gains access to the complete desktop and mobile market through Intel.
4) Mobile Ryzen APU sales
• Provides gaming capability in a compact power envelope.
5) Ryzen and Threadripper sales
• Fabricated on 12nm starting in April.
• May eliminate Intel's last remaining CPU advantage in single-core IPC.
• AM4 motherboard supported until 2020.
• 7nm Ryzen on track for early 2019.
6) Others: Vega, Polaris, Semi-Custom, etc.
• I consider any positive developments here to be gravy.
Conclusion
While in the past Intel interfered with AMD's ability to bring its products to market, the market has changed. The Internet has grown significantly and is now a large market that dominates computer sales. It's questionable whether Intel still has the influence to affect this new market, and doing so would most certainly result in fines and further bad press.
AMD's foundry problems were turned into an advantage over Intel.
AMD's more recent past was heavily influenced by the failure of the Bulldozer line of CPUs, which dragged on AMD's bottom line from 2011 to 2017.
AMD's Zen line of CPUs is a breakthrough that exploits an alternative, superior strategy in chip design, resulting in a smaller CPU. A smaller CPU enjoys compounded yield and quality advantages over Intel's CPU architecture. Intel's lead in CPU performance will at the very least be challenged, and will more likely come to an end in 2018, at least until they release a redesigned CPU.
I previously targeted AMD to be worth $20 by the end of the Q4 2017 ER. This was based on the speed at which Intel was able to get products to market; in comparison, AMD is much slower. I believe the stock should be there already, but the GPU-related story was prominent due to the cryptocurrency craze. Financial analysts need more time to catch on to what's happening with AMD; they need an ER that is driven by CPU sales, and I believe the Q1 2018 ER is the one to do that. AMD had EPYC stock in stores when the Meltdown and Spectre flaws hit the news. These CPUs were sold out by mid-January and are high-margin sales.
There are many variables at play within the market; however, barring any disruptions, I'd expect AMD to be worth $20 at some point in 2018 due to these market drivers. If AMD sold enough EPYC CPUs due to Intel's ongoing CPU security problems, then it may occur following the Q1 2018 ER. However, if anything is customary with AMD, it's that these things always take longer than expected.
submitted by kchia124 to AMD_Stock [link] [comments]

High memory FPGAs? Other newbie questions.

Background: My degree is in computer science. I've worked various jobs in IT-related fields. I have programmed in a smattering of languages (C/Java/etc.) and mostly worked at the application layer and up. In school I did take several computer hardware classes, and in one we built our own 16-bit MIPS CPU (with pipelining) in a circuit simulator that one of the professors had written himself in Java. We created our own DRAM cells, NAND/AND/XOR/etc. gates, our own shifters and multiplexers, stuff like that. But again... in a Java GUI, not anything practical like Verilog.
End goal: I like performance. I like to tinker and make things go faster. I missed the bitcoin (and altcoin) FPGA rush by about 2-3 years. Currently, GPU mining is dominated by so-called 'ASIC resistant' algorithms which mostly resist ASICs by requiring a large amount of RAM (800MB-2GB, depending on algo) as part of the algorithm.
Partly as a novelty for my personal use, and partly so that I have at least some level of knowledge of yet another IT field I can go into should I ever get laid off or sick of my current job, I want to learn how to program an FPGA for the purposes of implementing at least one of these ASIC-resistant algorithms. The energy savings over a GPU would be nice as well, as from what I can see most FPGAs are at least an order of magnitude, if not two, more efficient than a GPU.
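As a rough illustration of why the memory requirement mentioned above matters, here is a sketch using Python's standard hashlib.scrypt as a stand-in memory-hard function; the cost parameters are arbitrary examples rather than those of any particular coin, and 128 * r * N is the usual rule-of-thumb estimate of scrypt's working set.

```python
# Sketch of why memory-hard algorithms resist cheap ASICs: the scratchpad has
# to exist in real RAM. scrypt is only a convenient stand-in here, and the
# parameters are arbitrary examples, not those of any real coin.
import hashlib

N, r, p = 2**14, 8, 1              # CPU/memory cost, block size, parallelism (assumed)
approx_mem_bytes = 128 * r * N     # rule-of-thumb working set (~16 MiB here)

digest = hashlib.scrypt(b"candidate block header", salt=b"nonce-0001",
                        n=N, r=r, p=p, dklen=32)
print(f"scratchpad ~{approx_mem_bytes / 2**20:.0f} MiB, digest {digest.hex()[:16]}...")
```

Scaling N up toward the 800 MB to 2 GB range mentioned above is what moves the cost of a competitive miner from "lots of logic" to "lots of fast memory", which is where an FPGA without external DRAM tends to struggle.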
So, on to my questions:
  1. Xilinx seems to be the number one 'newbie' FPGA maker (I say this because I see them mentioned more frequently than any other maker in threads for/by newbies). Apparently I can download their software kit and use their simulator to design and test in software; I don't need the actual FPGA circuit/card until I am ready to put my design 'into production'. Am I correct in thinking this? Are there other manufacturers I should target instead?
  2. Given my school experience designing circuits at something like a UML or CAD level, are there any recommended paths I should look into, or is learning VHDL/Verilog a pretty hard requirement to implementing any design? I'm asking because I'm ignorant.
  3. Maybe best asked for later, but would the Xilinx simulator give me some idea of which FPGA I'd want to buy? Will it give me anything approaching an 'expected behavior' ('hey, it runs at 1h/s on your CPU, but on our Model XYZ it will run around 300-400 h/s!') Or will there be anything useful in helping me pick FPGAs? Because...I already looked, and holyfuckballs there are a shitload of FPGA models and makers and ohgod how do I pick from this literal shitstorm of models?!
  4. Are high memory FPGAs a thing? I saw some stuff on Intel's website that indicated they had a (very expensive) FPGA that also had DDR4 slots built in. That would solve the memory issue. Do I actually need that though? Would it be the case that even a cheap (~$100) FPGA would end up having enough cells to implement 1GB of memory? I don't really care if I end up having a massive grid of registers, apart from the fact that, well, logic not used for computation is wasted energy and silicon space. But maybe 1GB of memory ends up being super tiny on an FPGA? Again, I understand how to build 1GB of memory in circuits, but not on an FPGA or in silicon.
submitted by Kingdud to FPGA [link] [comments]

Satoshi increased the block size by 100,000%!

...sort of... in a manner of speaking.
In mid-2010, when the average block size was about 1KB, Satoshi put in the 1MB block size limit because the project he had worked on for years was finally gaining real traction. People were using it!
Everyone had to run a full node in order to have a wallet and send bitcoins, so this led to risk. The risk from someone mining a series of large blocks: say, 5, 10 or 20MB. BTC was very cheap then, and miners had no money, compared to the huge amounts today invested in hardware for which they want ROI, investments large enough for them to want Bitcoin to succeed.
FPGA mining was just being discussed in mid-2010. What if a rogue miner got a head start and kept hammering out spam blocks, killing off the fledgling interest in Bitcoin? Of course, he did not make the limit 2KB. That would quickly cripple usage and blow out confirmation times! So, Satoshi capped the size at 1MB.
Did the blocks immediately fill to the 1MB level? No. And they won't after raising the limit a mere 800%, or even doubling that every two years.
Satoshi knew the 1MB limit was a risk to his long-term vision for a global competitor to the likes of VISA, but he made massive code changes, and frequently. He just assumed it would be easily increased or removed when the time was right.
The time is right.
submitted by solex1 to Bitcoin [link] [comments]

The $22,484.00 Butterfly Labs Mini Rig bitcoin miner is a huge, broken, unstable piece of shit.

(This was a rather controversial article posted on Buttcoin.org and became quite popular, even moving to the top of /bitcoin. It's since been mysteriously edited on the site [maybe by g-g-g-ghosts!] so it's being reposted here for posterity's sake. Some numbers may be off by now, but it was all accurate at the time of posting.)
Butterfly Labs has a long and horrible history with their mining rigs. They started taking pre-orders over a year ago, with a ship time sometime in late July. After numerous delays in production, shipping problems and general incompetence, the only thing they've managed to get out the door are some of their tiniest miners, the Jalapenos. And those mainly ended up in the hands of reviewers and blogs in order to keep pumping the Butterfly Labs hype train and securing the millions of dollars of pre-orders still in limbo. Lucky BFL forums user Luke-JR, however, scored a sweet Mini Rig from Butterfly Labs (it's just a coincidence he's a driver developer for them, I'm sure). This rig was originally promised to produce 1500 GH/s of hashing power at 1500 watts for $30,000, but has since seen its hashing power slashed to a third of what was promised and its power consumption increased 75%, now offering just 500 GH/s at 2400 watts. They've promised to make good on pre-orders by sending out 3 rigs to match the initial hashing rate, so now it's 1500 GH/s at 6900 watts, a reduction in GH/watt by a factor of 5.
So what does $22,484 buy you? Take a look!
Minirig is here! Today, my Minirig arrived.
http://i.imgur.com/Yp0WPvE.jpg
FedEx apparently dropped it somewhere along the way, and the weakest part of the case, the thin metal part around the back of the PSU, broke.
http://i.imgur.com/lFcOHxP.jpg
I’m not sure how sturdy the back side was supposed to be, but its two pieces aren’t quite together either.
http://i.imgur.com/AVttcOt.jpg
The power supplies (EVGA 1500W) also created havoc interfering with the neutral on the power line. This disrupted X10 communication significantly enough that the pool overflowed because the system controlling it was unable to turn off the pump. Workaround: This PSU supports 240V, so we rewired the outlet. 240V does not use neutral, so now all should be okay.
Edit: 240V workaround is only partial. Still having problems
But the good news is, it all seems to be working for the most part.
Next up, installing it in the window so the heat goes outside
A twenty two thousand dollar box of electronics that is broken out of the box, that required the guy to do a sketchy electrical workaround to get partially working, that he is going to install in a window… and he’s happy about it?
In case you didn’t notice it, the delivered unit is different than the picture on the website. They had to install 2 power supplies instead of 1 and had to modify the case to fit. Also, if you didn’t notice, the LCD/Phone thingy in the front has been replaced by … a piece of cardboard spray painted black. Wonderful.
You could maybe chalk this up to a careless FedEx driver, but when you're shipping something that costs as much as a mid-sized sedan, how about putting a little more effort into the packing? Dell and HP can ship bigger and heavier servers across the world without this kind of problem.
The unit meets its hugely increased power draw with dual EVGA consumer-grade power supplies. We're talking an almost 75 amp load (6*1500/120), disregarding power factor. He could very well overload the circuit panel and trip the main breaker for the house.
Let’s take a look inside this guy.
This is from an earlier version of the Minirig (note the single power supply). It's apparently from an earlier FPGA model, but it gives you a good glimpse of the kind of craftsmanship you can expect from a computer that costs half the average household income in the United States.
Consumer grade PSU and cheap USB hubs glued to the inside case.
Electrical tape and random velcro glued to the insides
A closer look at the USB hubs. Plugs are hot glued to stay secured.
Electrical tape everywhere, splices and voided hardware are the theme.
You can view the entire album here.
Despite all that, this thing can still mine bitcoins and it should be profitable. Keep in mind that many people jumped in on the preorders a year ago when bitcoins were still hovering around $6.50 each. Meaning customers paid 1562 bitcoins for that particular piece of shit, which at today's value is $156,200. Aston Martin money. How long will it take them to make their money back (as opposed to just hanging on to the coins)? If the difficulty didn't change, they would make 37 bitcoins a day and recoup the initial investment in 124 days. Difficulty is jumping pretty much 20% every 12 days or so, so in the next week before adjustment they'll make 259, the next 12 days 369, the next 12 days 312, then 256, then 213, etc.
So by day 127 they'll be halfway to breaking even, but by day 151 they'll be making less than 5 bitcoins a day, and even if difficulty stopped rising at that point (which it won't), it would take another 435 days, for a total of 586 days, to break even. If difficulty kept rising at the same pace, by day 200 they'd be making 2.4 bitcoins per day, and it would take 1024 days to break even with no difficulty increase. Assuming 25 cents per kWh and $100 a bitcoin, it would cost 0.43 of a bitcoin per day in electricity, which means the unit would no longer be profitable on a power-usage basis by day 307, at which point it will have produced 2620 bitcoins.
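A quick sketch of the decay arithmetic in that paragraph; the 37 BTC/day starting rate, the roughly 20% difficulty jump every 12 days, and the 7 days remaining before the next adjustment are all the post's own assumptions, and the per-period totals printed roughly reproduce the 259/369/312/256/213 sequence quoted above.

```python
# Reproduces the per-period output figures quoted above. Inputs are the
# post's assumptions, not measured data.
rate = 37.0                  # BTC per day at then-current difficulty
periods = [7] + [12] * 4     # days until the first adjustment, then 12-day periods
total = 0.0
for days in periods:
    earned = rate * days
    total += earned
    print(f"{days:2d} days at {rate:5.1f} BTC/day -> {earned:5.1f} BTC (cumulative {total:6.1f})")
    rate /= 1.20             # ~20% difficulty jump per period
```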
Bear in mind this is only for the first few units, and that’s running 24/7 pumping out around 24,000 BTU, so yes, medical bills from heat stroke will be on top of that.
But alas, the chips don't run nearly as well as they're supposed to, frequently running too hot and giving multiple hardware failures. Coindesk noted in one of the first ever runs of the Minirig, by hosting provider gigavps, that it was running much too hot and erroring out.
At the time of posting, gigavps warned that the unit would be repeatedly shut down while ckolivas, who was assisting, modified the machine’s software to optimise performance. After some tweaking, the device was said to have been left to run continuously for two hours, and was shown to have an average hash rate of 478.1 GH/s. As you can see in the table below, ASIC number four (of a total of eight hashing chips) ran significantly hotter (86 degrees) and consequently gave the highest hardware (HW) error rate.
http://i.imgur.com/q3iGrnb.jpg
So, what happens if you just decide you don’t want this, you don’t want to wait over a year to get a $22,000 broken piece of shit? Nothing, because BFL won’t let you cancel your preorder because they’re now “shipping”, i.e. they sent out one unit to their own company shill.
http://i.imgur.com/0p3Up03.jpg
Which is of course illegal regardless of what Butterfly Labs may say.
So in summary: Don’t buy anything from Butterfly Labs … ever.
submitted by borderpatrol to Buttcoin [link] [comments]

Current state of hardware for ether mining?

I read this post from a year ago briefly discussing how it's inefficient to mine ether on a high-end laptop.
I'm aware that there are bitcoin farms in the middle of nowhere (cheap electricity) using FPGAs to mine bitcoins as efficiently as possible, and was wondering whether the same applies to ether?
In some of Vitalik's talks, he repeatedly states how he'd like for the network to be supported by general purpose commodity hardware; such as phones and laptops.
On a high level, is this still possible?
submitted by Olshansk to ethereum [link] [comments]

Hashrate of Altera's SHA256 "megafunction" on Hardcopy FPGAs

I'm an electrical engineer by qualification but I've spent the past 20 years in software development, so I'm a bit rusty on the eleceng side these days. But I still know enough to get myself into trouble :)
I've been looking into available SHA256 algorithms and found a datasheet on Altera's SHA256 "megafunction" here. It's a SHA256 implementation that can be used in ASIC and FPGA designs.
The datasheet mentions that the processing of one 512-bit block is performed in 66 clock cycles and the bit-rate achieved is 7.75Mbps / MHz.
There's a table showing throughput on various Altera chips. The Hardcopy HC210F484C runs at a maximum clock frequency of 152MHz, permitting a throughput of 1152Mbps with the algorithm.
Given the Bitcoin hashing algorithm is a double SHA256 operation, it would appear that this chip can deliver a hashing rate of approximately 576Mh/s. Would this be correct?
A very quick check on the price of these chips here shows they're anywhere from USD$0.01 to USD$9.90, depending on quantity.
If the above is all true and I haven't made any mistakes, this could make for a very cheap mining rig. Can anyone verify my analysis or have I made huge mistakes?
UPDATE: Hashrate is 152MHz / (2*66) cycles = 1.15Mh/s. Thanks to deeper-blue and MaybeJustNothing.
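For anyone checking the arithmetic, the corrected figure in the update can be reproduced directly from the datasheet numbers quoted above; the factor of two reflects the double SHA256 in Bitcoin's hashing.

```python
# Reproduces the update's corrected figure from the quoted datasheet numbers:
# one 512-bit SHA256 block takes 66 cycles, and each Bitcoin hash here is
# treated as two SHA256 passes.
CLOCK_HZ = 152_000_000              # HC210F484C max clock per the datasheet
CYCLES_PER_SHA256_BLOCK = 66

hashrate = CLOCK_HZ / (2 * CYCLES_PER_SHA256_BLOCK)
print(f"~{hashrate / 1e6:.2f} Mh/s")   # about 1.15 Mh/s, not 576 Mh/s
```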
submitted by zappso to Bitcoin [link] [comments]

Reprogram bitcoin mining boards?

Anybody know how hard it would be to repurpose the (relatively) cheap high-capacity bitcoin mining boards on eBay for general-purpose FPGA experimentation? I'm interested in doing some DSP-like things with them (assuming they're small enough that the free toolsets still work with them).
submitted by redhairedscot to FPGA [link] [comments]

[Table] IAmA: My name is Ryan Ackroyd and back in 2011 I was arrested for my part in the groups known as Lulz Security (LulzSec), Anonymous and Operation Anti Security (#AntiSec). I am LulzSec, AMA!

Verified? (This bot cannot verify AMAs just yet)
Date: 2014-04-03
Link to submission (Has self-text)
Questions Answers
What was it like being raided? GAME OVER PLEASE INSERT COINS.
Do you have any regrets about your involvement with those groups? Snowden, hero or villain? Why? There's no point regretting anything because you cannot undo the past. I heard about Snowden in prison, came on the news but I only know what the news has been telling me. I will look more in to it, he's got balls I can tell you that and I admire a good set of balls.
Really? Good set of balls? Are we not doing phrasing anymore? Yes really.
Any advice on learning how to program? I'm struggling with Python at the moment. You don't learn how to program, you learn how to use a programming language; programming is what you do with that information. Read lots of examples of source code and see if you can work out what's going on. Don't just copy/paste the examples, write them out word for word no matter how long they are. Not only will it help you understand what the code is doing, you may also learn a few tricks.
That's deep man. Thanks for the advice. No problem.
You don't learn how to program, you learn how to use a programming language. Damn, pretty sweet quote there. It's also true.
How did you get started 'hacking'? I first started back in 1998 I mostly used to hack PC games, hex editing, binary reverse engineering stuff. Then I learned how to use C and assembly and it just went from there.
What do you recommend for someone that's interested in hacking(in terms of learning it)? Hacking is the art of reverse engineering, before you can reverse engineer something you need to have an in depth knowledge of it.
What are your thoughts on mass surveillance? Any advice for the masses? It's been going on longer than you or I have existed. Strong encryption is the key.
Ever considered a job with the NSA? GCHQ? I don't think I would pass the security clearance now anyways. I am x military though so who knows.
I am surprised you are allowed to use computers or the internet at all now, considering some of the parole conditions of many American hackers. Can you talk about some of the restrictions in the SCPO and why you feel they are unfair? Basically it prevents me from using encryption that allows hidden volumes, prevents me from using virtual machines, and I cannot delete my internet history. It lasts for 5 years and each breach of the order is punishable by up to 5 years.
It's not a case of them being fair or unfair. I can't go too in to detail about it if I want to take it back to court, I just want to challenge it, plus I like to keep my cards close to my chest.
So would, say, Firefox's private browsing mode be against the rules? I don't think so, I'd have to check the src of Firefox to determine if it deletes the history in private mode or if it simply does not record it in any way. The order says "Not to delete".
So if you just image a fresh install of your variety of os(debian?) everyday would that break your parole? I could just boot live OS and not be in breach.
Is that what you do? I used to, remember slax?
Are you proud of the sacrifices you made in your lifetime? Yes I am, I never look back I just move forward in to the future hoping to find peace and happiness.
Did you meet anyone interesting in prison? There's some funny characters in prison I can tell you that, made friends for life. Most of the people in there were in there for fraud, drugs, gang stuff. I never met any famous prisoners if that's what you mean? Then again everyone's story in prison is interesting.
What is the best way to protect your digital identity, and secure your privacy? Since you have experience exploiting these vulnerabilities, what do you suggest to prevent it? Strong encryption, limit the amount of personal information you place on the internet, don't draw attention, only use a sandboxed browser and tunnel your HTTPS through SSH using the highest of encryption, PGP mail for every mail, don't store encryption keys on your computer, full disk encryption and/or hidden OS, use different emails for your banking, paypal etc, Mak3p4$5w0rdZ!nCr3d!b4LlYsTr0nG, uninstall flash use Link to youtube.com instead etc, etc...
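On the strong-password point in that answer, here is a tiny illustrative sketch using Python's standard secrets module; the 24-character length and the full printable alphabet are arbitrary choices, not a recommendation from the AMA itself.

```python
# Generate a password from a cryptographically secure RNG instead of
# inventing a "clever" one by hand. Length and alphabet are arbitrary here.
import secrets
import string

alphabet = string.ascii_letters + string.digits + string.punctuation
password = "".join(secrets.choice(alphabet) for _ in range(24))
print(password)
```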
Thanks for your answer. I am not worried about personal use, I am more worried about enterprise and or security systems. preventing people from hacking my companies website, and social media accounts. I have to keep in compliance with the financial standards, so we use PGP, and Erado, for archiving and encryption. It is just nice to get the perspective what is an easy target, what would you consider a difficult target not worth pursuing? I could write a book to answer this...
Security isn't something you can just apply it's an on going process and in most cases you are relying on products sold to you on a trust basis. You're told that Antivirus software will stop you getting a virus but what you're not told is that Antivirus software only protects you against what is known, takes less than an hour to write a new piece of malware or backdoor that your AV will never detect.
People put too much faith in to security systems without understanding how they work, their capabilities or limitations.
There's no need to be "pentested" - it's a right con; they only test you against old exploits, and if you update regularly then you're already safe.
Use 2-factor authentication and/or a strong password policy, and prevent users from creating their own passwords (they will only make them simpler).
Just make your data worthless to a hacker by encrypting it.
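As a minimal illustration of the "encrypt it" advice, here is a sketch using the third-party cryptography package's AES-256-GCM interface (assumed to be installed); key storage and management, which is the genuinely hard part, is left out entirely.

```python
# Minimal sketch of encrypting data at rest with AES-256-GCM.
# Assumes the third-party "cryptography" package; how the key is stored and
# protected is deliberately out of scope here.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # keep this away from the encrypted data
nonce = os.urandom(12)                      # must be unique per message under one key

plaintext = b"customer database dump"
ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
assert AESGCM(key).decrypt(nonce, ciphertext, None) == plaintext
```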
Insightful, thank you. I know it's a vague question, but you just confirmed my intuition. Thanks for your honest answers. No problem.
Was it worth going to prison? Nothing is worth going to prison for. Yeah they are happy I am back and I am happy to see them again. It was ~100GBP for my parents to come and visit me in prison, I was a long way from home. I told them not to waste their money because the visit only lasts 2 hours.
How is your family treating you, are they happy to have you back? Did they visit you in prison? I am x military I am used to being away from home for long periods of time. It also makes time fly faster when you're not constantly reminded about home. If you don't know what you are missing then you don't miss much.
I've never been to prison, but I lost about 8 years of my life due to bad medical advice. I used to be angry-- I was thinking why wasn't it only 2 years-- why did it have to be such a complete waste of so many years? Would that have happened if it had been only 2 years instead of 8? I don't think so. Sometimes, events like that helps to put things in to perspective. We often don't think about our lives on a day to day basis but when something out of the "norm" happens it makes us think that little more.
What do you think about bitcoin? where will it be in 10 years? I think it's great, scrypt coin too! I think they will get heavily regulated soon though. I had 78 bitcoins backed up on a SDHC before I was sent to prison. Back then they were worth ~$40 each. Come out of jail and I can't find that shit anywhere lol.
The only problem is even if I do find it, it's encrypted with keys that are on the computer the police confiscated and I can't get back.
I know right?
Sue them. They stole you $ 78k. It's killing me!
If they've not destroyed your computer or had a seize order against it then you can get back things like that. Ask your lawyer. They seized it in court, I can however apply "for a copy of evidence" under UK law and they would have to give me a copy of the HDD.
Do this. Those bitcoins won't last. How long until they are orphaned? It's been ~3 years already.
No idea. I meant they won't last as a concept (especially not at that value). Wow, only just seen it but they peaked at ~$1000? Feels bad man.
Exactly - since then the Chinese and Russians have banned it, and you can only consider it's a downward spiral from there. I wonder how long it will last here in the west, I'm surprised it hasn't already been regulated the tax man must be pissed!
I think there are also more constructive uses for mining, I mentioned this on twitter also. See there are a lot of cancer researchers and scientists that have HUGE amounts of data that needs to be mined to help cure diseases and solve the worlds problems.
The world hash rate for crypto coins is more powerful than most of the worlds super computers combined. Cancer researchers should give coins for mining cancer research data instead of wasting time generating meaningless hashes.
Do you like sporks? Yes, I do, do you?
Hey man we are asking YOU the questions here. Leave the interrogations to us the professionals. Ok bro.
Ex Army eh? Signals? or Tech in REME? I was in the Infantry.
So how were you ID'd for arrest? I fucked up, I've already explained it on here, lurk moar.
Just curious, do you think big brother is reading this ama real time or at all? They will be on this like flies round shit.
Where does/did LulzSec congregate? What is the education background of a typical member, or from where did you learn to do what you can do? We had our own private IRCD and encrypted silc servers, also channels on public servers. I am completely self taught, I have been at it since 1998.
Isn't the true "lulzy" part of this that you guys bugged some people and mocked their security and then ended up getting busted and going to prison while the companies you annoyed made their security better? That's pretty lulzy to me. What I find lulzy is the fact they were all vulnerable to webapp vulnerabilites and don't use PGP.
Wasn't LulzSec responsible for the major Sony Online Entertainment security breach? I know you wanted to retaliate because SONY was suing George Holtz for the PS3 jailbreaking thing he did, but the only thing that happened is that you inconvenienced the people. Basically I am asking. Why did you ruin all those accounts, which lead to people having their credit cards stolen, instead of just attacking SONY directly. Also, how many accounts were compromised? LulzSec says 1000, SONY says ~4000. LulzSec had nothing to do with the PSN breach, fact.
Do you suspect who was responsible? (group name, not actual name) Honestly, no Idea there is too much internet rumor and no one has officially come forward to claim it. I read some people in Spain were arrested and questioned in relation to the PSN breach but that still doesn't mean it was them.
What are your political views? I don't really have any political views, however I would like to see the rest of the world get a chance to sample democracy and be free from oppressive dictatorships (no naming names).
At first welcome back. So here an total haxolulz/past unrelated question what kind of music do you like? I like Hard Dance music, Hard Trance/Hard House.
Fuck yeah! Trance and House all the way. I used to have a set of Technics 1210 Mk2 and mix like mad when I was younger, still got some of the vinyl haha.
Hackers soundtrack, orbitals by halicon. Not hard but one of my favorites. Thanks I'll check this out. The LuLzSec hacker song by ytcracker is the ONLY hacker song haha.Link to www.youtube.com
Who is your favorite Trance artist? Ever been to any festivals? Tidy boys, nothing but. Their events are awesome, youtube "Tidy weekender"
How come you guys got the CIA down? It was hit with a 100,000 node botnet.
What does this mean in layman's terms? 100,000 computers were used to send a lot of data to the server, causing it to crash.
That's as simple as it gets.
Any advice for aspiring hacktivists? Trust no one.
Still on 4chan? fav boards? Rules 1&2 always apply.
Hate to break it to you bro, but 4chan is just full of kids now. The rules of the internet have somewhat faded since you left the intertubes in 2011 :( consequences will never be the same again. Ebaums world has been dead a longtime because people broke 1&2
What's your opinion on the book by Parmy Olson? I think I recall she said she talked to you by chat, but how accurate was what eventually made it into the book? I lost the book before I could read it all so I don't know...
What do you think your job prospects are now with the conviction hanging over your head? Not too good! I'm on my arse at the minute, I'll do any job!
Any job? ( ͡° ͜ʖ ͡°) Allmost...
Write a book. Is it really needed?
Not needed per se. But you can tell an autobiographical perspective that nobody else can. The story lulz & Anon supporters would want to hear. Obviously not a 'tell all' but a 'tell most'? Yeah I get what you mean.
Was there a time before you were caught that you were worried or thought the authorities might have a clue as to your real identity? I knew I was going to get caught, I fucked up.
Did you have a backup plan? There was no backup plan but I realized the mistake early and acted accordingly! No I do not own guns, they're a lot more restricted here in the UK.
Are you still in touch with Jason? Saw he went to prison recently, I actually went to HS with him. How long did he get?
Jeremy, not Jason. Twins. Yeah, I think he got 10 years or something like that? I've not heard anything from him since I got out, hope he is ok.
I look up to you on how you managed to convince everyone you're a girl, smart and funny thing at the same time. Really I look up to you like a celebrity, hah. Maintaining a strong persona such as kayla and doing what I did wasn't easy. If I could go back in time I wouldn't change anything, I'd just keep going back in time to relive it over and over again. Nothing is worth prison time, it's a waste of life however I did manage to get some qualifications out of it. Not many people can say they left prison with a diploma for the thing they went in for.
If you could go back in time, before the creation of Lulzsec, what would you do differently? Also, was Lulzsec's missions worth the prison time? Where did you learn social engineering, and hacking? I learned to do what I do from programming, it is simply reverse engineering.
Thank you for your answer. What do you think about the Syrian Electronic Army, their methods of attacks, and their meaning? I honestly believe they are state sponsored (possibly not from the beginning) but I do believe they are the people Assad is turning to. As for their methods I don't even know what methods they are using, I'll read in to it though, I'm fresh out of prison and most of what they have done was done while I had no internet access. I've been banned from the internet since 2011.
That they are, got many of Advanced Replacements on products. If you had to choose between, Windows or OSX which would you choose. I'd choose Windows, I'm not a fan of OSX.
This is probably the longest I've ever seen someone take questions in an AMA. Thank you for dedicating so much time to answering people's questions. No problem I enjoy answering questions and I will be here until the questions dry up.
How did it feel being able to get behind a computer for a first time in 2+ years? Did you build a rig to re-live the novelty or do anything particularly special? Porn, me thinks. I was able to use computers in prison. I was given a job as a class room assistant helping teach people about computers, basic things. I also managed to complete a number of computer related qualifications and gained a diploma. Seeing the internet after 2+ years was kind of strange though. The internet looks a lot different now compared to how it did in 2011. To be honest I don't like it, it has that "iphone look" to it all, big buttons and lots of unneeded JS. I wish I could build a rig I just do not have the money to do so. Hopefully soon I will find a nice job and maybe then I will build one.
Did you have internet in jail? No they do not allow internet in UK prisons.
Do you think access to the internet should be a human right? When will it be? Yes it should be and I am sure it is already so!
Here you go Link to www.wired.co.uk
My question: Did you ever have any suspicions about Sabu? At any point did you guys think, Something is up with this dude? Everyone had suspicions about everyone, things were being patched, rootkits disappearing, logs being leaked etc..
Was prison anything like you had expected it to be? Better than I thought it would be, I had most prison movies flashing in my head on the way there from court but it's nothing like that.
Could you elaborate more on what it's like please? Once I got to prison I realized it's nothing like the movies. Because it was all over the news when I was sentenced everyone in the jail knew who I was and what I was in for. I've made some good friends out it, people I am going to meet again once they are free. No one was ever violent towards me, you don't give any attitude you don't get any attitude. I did see some people get fucked up in there, mostly starts over something really silly, like a game of pool.
How did you pass the time? I used to pass the time playing chess with my cell mate or by watching all crazy people in there.
Did you personally encounter much violence in the prison? Or is it easy enough to stay away from it if you don't rock the boat or whatever? Well, most of the prison movies I have seen show people stood at the bars, big guys growling "mmm fresh meat" and everyone getting stabbed up every 2 minutes. When you've never been to prison before you can't help but think back to every movie you've seen about prison.
When do you think the govt will evolve enough to have guys like you helping to grow our society? Right now it seeks to hunt you down...terrible waste of resources imo It is, there's a lot of talent rotting away in the prison system.
Administrations that in one breath expresses the need for engineers of digital space and then issues the order to capture the very people with such talents. Edit: its been many administrations haha :*( And many more to come!
Its wishful thinking but i hope that the economic imperative will win out as we progress to what can be depicted best by the Type I civilization coined by popular theoretical physicist Michio Kaku. Things need to change, that's for sure.
Is Milhouse a meme? Milhouse was the best meme!!
Do you like Wheatus? No, Wheatus sucks.
How do they ? Well I just like to suck wheats thats all.
What do you think - in Wrestlemania '98, did Mike Tyson get punched in the cock? Never watched it back then, I hope so!
Are you now like Gavin Orsay (Jimmi Simpson) on House of Cards? Never watched it, only just got out of prison I've heard a lot about that series, I'll look in to it.
Gavin in House of Cards has a guinea pig. Reminded me of "Noodles". :3. Hahaha.
the character Gavin is this hacker and there's a scene where he tells his FBI handler that they have to drop charges against Barrett Brown. Link to www.youtube.com Really?
Indeed. And check out who advised on the new season of HoC :3. Link to www.theguardian.com. HAHAHAHAHA.
Do they let you have PlayStation in British prisons? Link to www.dailymail.co.uk. They do if you buy the PlayStation.
What did you have in your cell? TV, Kettle, Radio, Bed, locker and that's it... I wasn't bothered about a play station, just used to play a lot of chess and poker.
What sort of people did you meet in association with the hacking groups? What kind of people are hackers? Out of everyone I have met, there are only a small handful of people that I would put in the "reverse engineer/hacker" bracket. Lots of people know how to use exploits; not many know how and why they work or are able to write their own. It truly is a dying art.
Do you ever think about trying to get hired for security for websites or for government computers or something along those lines? Again it's probably a trust issue...
Personally, I was very stupid and didn't really care about the ease with which personal information can spread, and didn't know how widespread NSA surveillance was. Given that a lot of stuff is out there already, if someone wanted to take the time to look at it, do you have any advice as to how to minimize the damage? Strong encryption is the key but to be honest, Internet Service Providers should be doing more to protect it's customers privacy, they should allow AES256 pipes for every customer as standard.
Many (fairly intelligent) people I talk to agree that they are concerned about surveillance and privacy on the internet, but they're resistant to trying encryption or anything like it because they have it in their heads that only supergeniuses can understand computers like that. Do you have any suggestions for introducing these concepts to slightly technophobic people? I think encryption needs to be simplified for the regular user. It can get a bit confusing if you have never used it before or don't understand its significance.
Also, wanted to thank you for your advice to the person below asking about how to learn programming. I know a little introductory Python, but I never really knew how to go from there to creating and understanding a whole big project. Programming languages are the same: learning new things is never easy; it's about determination and the will to succeed. The best way to learn is to read lots of source code and try to work out what is going on. Maybe you could even improve on it?
How did you learn to hack? From learning to program.
Hacking is reverse engineering: it's an in-depth knowledge of the underlying system/code/hardware/technology and the ability to take it apart piece by piece and reassemble it into something else.
How long did it take you to become a proficient programmer? How did you learn? How long did it take you to become a proficient reverse engineer? How did you learn? I just learned by doing, trial and error. If there was something I did not understand I would research it; it sometimes helps to read about it from more than one source. I used to read lots of examples and try to make sense of them, and whenever I found something new or something I did not understand I would research that too.
As for time, I am unsure... I am still learning now; it is an ongoing process. All I can say is learn the basics and be creative. If there's something you do not understand, research it; Google always has the answer.
Have you only just got out of prison? I have been out a few weeks but I have only been on the internet a few days now.
Ever hear from anyone in that channel? Just one, those times are sadly gone.
When you got V&, they said two men were arrested for using the Kayla alias in some media. What happened there? The other person was my brother; they were unsure if it was him or me.
Ahh, your poor brother... What happened when you got raided? How many people came, and where did they take you? I was home alone; they arrested me and took me to the police station for questioning. They told me during questioning that they had also been to arrest my brother.
Is the human body like a very good PC built with very cheap software, if it's so easy to exploit? :( No, it's a design prone to failure.
I want to learn Python 3. Codecademy is Python 2 (like all good tutorials...they seem to be for 2). Thanks for the links. I'll look into them. At the minute I am going over logic and reading into some Verilog and other HDLs, because I want to experiment with some Field Programmable Gate Arrays (FPGAs).
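For anyone in the same Python 2 vs. Python 3 boat, the gap is smaller than the tutorials make it look; the sketch below is a quick, unofficial summary of the differences a beginner hits first, not something from the original thread:

    # Python 3 versions of the things that most often trip people up in Python 2 tutorials.

    print("print is now a function")    # Python 2 allowed:  print "..."

    print(7 / 2)     # 3.5 -- "/" is true division in Python 3
    print(7 // 2)    # 3   -- "//" gives Python 2's old floor division

    text = "strings are Unicode by default"   # Python 2 needed u"..."
    data = text.encode("utf-8")               # explicit bytes when you need them

    for i in range(3):    # range() is lazy in Python 3; xrange() is gone
        print(i)

Most introductory Python 2 tutorial code carries over once those few habits are adjusted.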
Hey its Kayla! Hahaha.
Hey, thanks for the advice in the thread. I'll definitely use that on my quest toward my computer science degree and furthering my programming language knowledge! No problem.
Kayla, I just want to say thank you. Haha no problem.
Meanwhile, while you were in prison our gov't set up a fake Twitter-type service in Cuba with the intent of helping topple the Cuban gov't. -_- I'd like to see how you came to this conclusion. Or is it official, and is that its official purpose?
I just find it ironic that while prosecuting with one hand the other is going to extreme lengths to topple a government with social media. It's usually the case.
You should try this at /netsec. Possibly, no way to link them over? I'll admit it, this is the first time I've ever used or been to reddit.
They would probably have a few more knowledgeable comments and questions. Doesn't seem like any of them took the bait!
Yea, go to /netsec and create a new post, and make it a link (to the url of this thread). Make the title basically the same, but bring up the fact that it's an xpost. Thanks, I'll do this after I've made a coffee!
My question: why didn't you guys get ahead of the game by targeting Google. They're the resident evil today. Care to explain?
I think he meant Umbrella Corp. They have their hands in everything, and are mercilessly expanding. First Umbrella fucked with zombies, now Google!
#tr0ll. Hahaha, I'd like to believe it but...
Last updated: 2014-04-07 22:43 UTC
This post was generated by a robot! Send all complaints to epsy.
submitted by tabledresser to tabled [link] [comments]

Submitted this to /r/BitcoinMining and despite messaging the mod, it was never approved. Asking here: with cheap electricity and rising BTC-to-USD values, are FPGAs still worth it?

Since the original was posted 8 days ago, the prices are way outta date.
I saw a sale of an FPGA on bitmit:
https://www.bitmit.net/en/item/19307-x6500-bitcoin-miners-working-100-400mh-s-for-less-than-20w#
I'm ignoring the one on eBay at the moment (it's ending in 3 hours anyway).
Basically, if your electricity ranged from $0.06-0.09/kWh (and got cheaper the more you used), would an x6500 be worth it at ~$229 USD? Or is 400 MH/s so low that you won't hit anything?
Side question: if you don't join a pool, are you still guaranteed to eventually mine a block, or will your chances go down so much that you'd just be burning electricity? Is it better to get a quad or so and make your own "pool" that way, or do they all work together to form one super-unit?
(I assume buying sooner is better: as the value of BTC goes up, so will the difficulty.)
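For anyone weighing the same numbers, here is a rough back-of-the-envelope Python sketch; the difficulty and block reward are placeholders rather than current network values, and the ~20 W draw is taken from the listing above, so swap in real figures before drawing conclusions:

    # Rough solo-mining expectation and running cost. Difficulty and block reward
    # are placeholders -- look up the current values before trusting the output.
    SECONDS_PER_DAY = 86_400
    HASHES_PER_DIFFICULTY_1 = 2 ** 32   # ~expected hashes per unit of difficulty

    def expected_days_per_block(hashrate_hs, difficulty):
        """Average number of days for a solo miner to find one block."""
        return difficulty * HASHES_PER_DIFFICULTY_1 / hashrate_hs / SECONDS_PER_DAY

    def expected_btc_per_day(hashrate_hs, difficulty, block_reward):
        """Long-run average BTC earned per day (same solo or pooled, before fees)."""
        return hashrate_hs * SECONDS_PER_DAY * block_reward / (difficulty * HASHES_PER_DIFFICULTY_1)

    def power_cost_per_day(watts, usd_per_kwh):
        """Daily electricity cost in USD."""
        return watts / 1000 * 24 * usd_per_kwh

    # Example: an x6500 at 400 MH/s drawing ~20 W, $0.06/kWh electricity,
    # with made-up network figures (difficulty 6 billion, 25 BTC block reward).
    hashrate = 400e6
    difficulty = 6_000_000_000
    print(f"~{expected_days_per_block(hashrate, difficulty):,.0f} days per block on average")
    print(f"~{expected_btc_per_day(hashrate, difficulty, 25):.6f} BTC per day")
    print(f"~${power_cost_per_day(20, 0.06):.3f} per day in electricity")

On the side question: solo mining only gives you an expectation, never a guarantee; joining a pool leaves the long-run average (minus fees) unchanged and just trades the tiny chance of a whole block for a steady trickle of small payouts, and wiring several of your own units together doesn't change that math either.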
submitted by Miner_Throwaway to mtred [link] [comments]

Looking to buy an ASIC miner, which one would you all recommend?

I used to GPU mine on my GTX 480. It was never particularly lucrative; I just generally enjoyed it. Now I am looking to buy a USB miner (or a couple?) that will mine at a reasonable hashrate. I know that it probably won't be profitable (though with the extremely variable price of BTC, who knows), but I'm just wondering which one I should get. Looking to spend less than a hundred dollars (I also wouldn't mind getting a couple cheap USB units instead). What seller will deliver a functional product within a reasonable time frame? I don't want to wait 3 months for it to arrive, and I don't want to get a unit with horrible design flaws, as many seem to be getting.
TLDR; Looking to buy a cheapish ASIC miner that will ship soon and will not break. What should I get?
Thanks!
Edit: Are these any good? http://www.ebay.com/itm/NEW-ASIC-USB-Erupter-Miner-USA-Seller-Bitcoin-V3-BTC-fpga-gpu-bfl-avalon-/200975270201?_trksid=p2054897.l4275
submitted by bobthedeadly to BitcoinMining [link] [comments]

BitCoin Mining FPGA Card
FPGA Mining Is Back! Crushes GPU Mining with $20-57 a Day ...
Inside a Bitcoin mine that earns $70K a day - YouTube
$500 Mining - Worth It?
Raspberry Pi 4 Bitcoin Mining For 24 Hours! - YouTube

Best Bitcoin mining software: CGminer. Pros: supports GPU/FPGA/ASIC mining, popular (frequently updated). Cons: textual interface. Platforms: Windows, Mac, Linux. Going strong for many years, CGminer is still one of the most popular GPU/FPGA/ASIC mining applications available. It is a command-line application written in C and is cross-platform, meaning you can use it on Windows, Mac, or Linux. Hobby Bitcoin mining can still be fun and even profitable if you have cheap electricity and the best, most efficient mining hardware, but mining is competitive, and China's cheap electricity has allowed it to dominate the mining market. Miners also help keep the Bitcoin network secure by approving transactions; mining is an important and integral part of Bitcoin that ensures fairness while keeping the network stable, safe and secure. Dedicated mining hardware is also sold directly: for example, a Xilinx BCU1525 (VU9P) PCIe FPGA mining board with 64GB DDR4 listed at $2,000.00 (+$8.25 shipping), and Shark Mining, a hardware company founded in Silicon Valley, California and based in Miami, Florida, which sells complete 4-, 6-, and 8-GPU (AMD RX 580, Radeon VII) mining rigs for Bitcoin, Ethereum, and Zcash, set up and ready to go.


BitCoin Mining FPGA Card

If you're looking to get into FPGA crypto mining for less than $200, you may find the Hashaltcoin Blackminer F1 Mini very interesting; let's review t...
BitCoin Mining FPGA Card - Duration: 4:06. CarlsTechShed, 97,578 views.
The Outlook on Cryptocurrency Mining ... Bitcoin Mining with FPGAs (EC551 Final Project) - Duration: 6:11.
I picked up a few PCI FPGA cards on eBay for 99p which, apparently, can mine BitCoins at a speed of 21 Ghash/s (once they're correctly configured!).
FPGA Mining Is Back! Crushes GPU Mining with $20-57 a Day per Card - Duration: 12:08. Crypto Currency, 101,123 views.
antminer D3 hash board repair - Duration: 39:35.
The virtual goldrush to mine Bitcoin and other cryptocurrencies leads us to Central Washington state where a Bitcoin mine generates roughly $70,000 a day min...
