All convolutions in a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width dimensions of the feature maps stay unchanged, so convolutions in a dense block all have a stride of one. Pooling layers are inserted between dense blocks for downsampling.
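The constraint above can be sketched with a minimal NumPy example (the shapes here are illustrative assumptions, not values from the text): channel-wise concatenation requires the height and width of the incoming features and the new features to match exactly, which is why a dense block cannot contain strided convolutions.

```python
import numpy as np

# Feature maps in NCHW layout: (batch, channels, height, width).
x = np.random.rand(1, 64, 32, 32)             # features entering a layer of the block
new_features = np.random.rand(1, 32, 32, 32)  # output of a stride-1 convolution

# Channel-wise concatenation works because H and W are unchanged.
out = np.concatenate([x, new_features], axis=1)
print(out.shape)  # (1, 96, 32, 32)

# Simulate what a stride-2 operation would do to the spatial dimensions:
downsampled = new_features[:, :, ::2, ::2]    # shape (1, 32, 16, 16)
try:
    np.concatenate([x, downsampled], axis=1)
except ValueError:
    print("concatenation fails: spatial dimensions no longer match")
```

Downsampling is therefore deferred to the pooling layers between dense blocks, where no concatenation with earlier feature maps is needed.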