BTC Sell-Side Liquidity Focused Around $96,000, Monthly Close Poised for Best April Gain in Nearly Four Years
On April 30th, according to Cointelegraph, citing TradingView data, Bitcoin traded in a narrow range ahead of the monthly close and just hours before the release of key U.S. macroeconomic data. The market is focused on first-quarter GDP and the March Personal Consumption Expenditures (PCE) Price Index, the Federal Reserve's preferred inflation gauge. Trading information platform The Kobeissi Letter noted that market consensus points to a GDP contraction: "All indications suggest that this will be the first quarterly GDP contraction in the United States since the second quarter of 2022." Despite the potential for significant volatility in risk assets, Bitcoin traders still expect the price to resume its upward trend.
Renowned trader Cold Blooded Shiller stated: "The next 24 hours will be a crucial moment for BTC and the S&P 500 Index, with Bitcoin having the chance to break through resistance and rea…"
FIFA is set to launch "FIFA Blockchain," and "FIFA+Collect" will migrate to the new chain
On April 30th, according to market intelligence, FIFA plans to launch an EVM-compatible blockchain named the "FIFA Blockchain." Its NFT marketplace, "FIFA+Collect," will be migrated to the new chain.
CZ: Advising Multiple Countries on Establishing Cryptocurrency Reserves
On April 30th, CZ shared his views on "Progress of the Global Crypto Reserve Strategy" from the TOKEN2049 stage in Dubai. He said, "We are consulting with multiple countries outside of Europe and guiding them on how to establish a cryptocurrency reserve similar to that of the United States."
CZ also touched on his reasons for investing in the X platform: "What I am most interested in is free capital. However, to obtain it, you need freedom of speech, which is fundamental."
The whale that spent 1,001 BNB on a heavy AIOT position now sits on a paper profit of $755,000, with a cost of $0.07476 per token.
On April 30th, according to monitoring by ai_9684xtpa, Binance's announcement of AIOT contracts briefly sent the token price up nearly 120%. The smart-money investor who built a heavy AIOT position with 1,001 BNB on April 25th now has a paper profit of $755,000: with a cost of $0.07476 per token and a current price of $0.2424, the return on investment stands at 178%.
Earlier reports showed that the whale withdrew 1,001 BNB from Binance on April 25th, then spent 698 BNB (worth $424,000) to steadily accumulate 5.68 million AIOT at an average price of $0.07476, immediately becoming the largest holder with 14.2% of the circulating supply.
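As a quick sanity check on the figures above, here is a minimal Python sketch of the position math; it only reproduces the numbers reported by ai_9684xtpa and makes no claims about additional on-chain activity.

# Minimal sketch reproducing the position math reported above.
# All figures come from the monitoring data cited in the article,
# not from independent on-chain queries.
tokens_bought = 5_680_000      # 5.68 million AIOT
avg_entry_price = 0.07476      # average entry price per token, in USD
cost_basis_usd = 424_000       # 698 BNB reported as worth $424,000
paper_profit_usd = 755_000     # unrealized profit reported on April 30th

implied_cost = tokens_bought * avg_entry_price   # ~ $424,600, consistent with the reported spend
roi = paper_profit_usd / cost_basis_usd          # ~ 1.78, i.e. the reported 178% return

print(f"implied cost basis ~ ${implied_cost:,.0f}")
print(f"return on investment ~ {roi:.0%}")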
CZ: The success of Bitcoin will ultimately drive other coins, and we are still in the early stages
On April 30th, CZ said on the TOKEN2049 stage in Dubai, "Up to now, the focus of this cycle has been on ETFs. And nearly all of it is Bitcoin. Ethereum has not achieved as much success yet. However, Bitcoin's success will ultimately drive other coins. But this requires time. I believe we are still in the early stages."
DeepSeek has released the Prover-V2 model, which has 671 billion parameters.
On April 30th, DeepSeek released a new model named DeepSeek-Prover-V2-671B on the AI open-source community Hugging Face. According to reports, DeepSeek-Prover-V2-671B uses the more efficient safetensors file format and supports multiple compute precisions, making the model faster and more resource-efficient to train and deploy. With 671 billion parameters, it is likely an upgrade of last year's Prover-V1.5 mathematics model. Architecturally, it is built on the DeepSeek-V3 architecture, adopting a Mixture of Experts (MoE) design with 61 Transformer layers and a 7,168-dimensional hidden layer. It also supports ultra-long context, with a maximum position embedding of 163,800, enabling it to handle complex mathematical proofs. In addition, it applies FP8 quantization to shrink the model size and improve inference efficiency. (Jinshi)
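For readers who want to check the reported architecture figures against the published files, a minimal Python sketch using the Hugging Face transformers library follows; the repository ID and the exact config field names are assumptions based on the article and common Hugging Face conventions, not details confirmed by the report.

# Minimal sketch: inspect the model's published configuration on Hugging Face.
# Assumptions: the repository ID "deepseek-ai/DeepSeek-Prover-V2-671B" and the
# standard transformers config field names; neither is confirmed by the report.
from transformers import AutoConfig

config = AutoConfig.from_pretrained(
    "deepseek-ai/DeepSeek-Prover-V2-671B",  # assumed repository ID
    trust_remote_code=True,                 # DeepSeek models ship custom config/model code
)

# Compare against the figures reported above.
print("hidden size:            ", getattr(config, "hidden_size", "n/a"))              # reported: 7168
print("transformer layers:     ", getattr(config, "num_hidden_layers", "n/a"))        # reported: 61
print("max position embeddings:", getattr(config, "max_position_embeddings", "n/a"))  # reported: 163,800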