Comments on Nuit Blanche: Video, Preprint and Implementation: Measuring the Intrinsic Dimension of Objective Landscapes

Anonymous — 2018-05-01 09:56:

Just to say where the variable precision comes from. The Walsh-Hadamard transform (WHT) maps points to certain patterns, and, being self-inverse, it maps those patterns back to the original points. Random projections based on the WHT have similar properties, except that they are not generally self-inverse; you have to construct an inverse if you need one.

Certain patterns can focus to a point in the output, letting you set whatever value you like there. With incomplete patterns (doing a dimension increase) you can focus exact values only at some places in the output; elsewhere you get only a low-precision approximation of the wanted value. It is all just the linear algebra of under-determined systems.

You could put a nonlinearity at the output of the random projection after the dimension increase. Using, say, a signed square function (y = x² for x > 0, y = −x² for x ≤ 0) you would get somewhat sparse synthesized weights with a spiky type of distribution.
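The self-inverse property and the constructed inverse described above can be sketched in a few lines. This is a minimal illustration, not the commenter's actual code; the `fwht` helper is a textbook unnormalized fast Walsh-Hadamard transform, and the random projection is assumed to be a random sign flip followed by the WHT.

```python
import numpy as np

def fwht(x):
    """Unnormalized fast Walsh-Hadamard transform (length must be a power of 2)."""
    x = x.copy()
    h, n = 1, len(x)
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x

n = 8
rng = np.random.default_rng(0)
x = rng.standard_normal(n)

# Self-inverse up to a factor of n: H(H(x)) = n * x.
assert np.allclose(fwht(fwht(x)) / n, x)

# A WHT-based random projection: random sign flip, then transform.
# This is no longer self-inverse, but an inverse can be constructed
# by undoing each step (transform back, then undo the sign flip).
signs = rng.choice([-1.0, 1.0], size=n)
y = fwht(signs * x)
x_back = signs * fwht(y) / n
assert np.allclose(x_back, x)

# The signed square nonlinearity mentioned above:
# y = x^2 for x > 0, y = -x^2 for x <= 0.
signed_square = np.where(y > 0, y * y, -(y * y))
```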
Or the signed square root (similar idea), which has attractor states of ±1, if you think the synthesized weights should have a soft binary type of distribution.

Anonymous — 2018-05-01 00:48:

If what they are doing is variable-precision weight sharing:

https://randomprojectionai.blogspot.com/2018/02/neural-network-weight-sharing-using.html

then it would be very interesting to figure out an algorithm to determine which weights are being set very exactly, to high precision, and which only to low precision. You can imagine schemes where you alternate backpropagation through two different random projections, then look at where they agree to high precision on some weights and disagree on others.

I presume other people think it is a very significant paper?
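The under-determined-system picture from the first comment can be made concrete with a small sketch. Assuming (my assumption, not stated in the comments) that the high-dimensional weights w are synthesized as w = Pθ from a low-dimensional parameter vector θ through a random matrix P, any d of the D weights can be set exactly, while the remaining D − d come out as whatever the projection gives:

```python
import numpy as np

rng = np.random.default_rng(1)
d, D = 4, 16                         # low-dimensional parameters d, full weight count D
P = rng.standard_normal((D, d))      # random projection (stands in for a WHT-based one)
theta = rng.standard_normal(d)
w = P @ theta                        # synthesized high-dimensional weights

# Pick any d target weights and solve the square subsystem exactly:
# these d weights are set to full precision...
target_idx = [0, 3, 7, 11]
targets = np.array([1.0, -2.0, 0.5, 3.0])
theta_exact = np.linalg.solve(P[target_idx], targets)
w2 = P @ theta_exact
assert np.allclose(w2[target_idx], targets)

# ...while the other D - d weights are fixed by the projection
# and can only approximate any further wanted values (low precision).
```

The two-projection scheme proposed above would then amount to training θ₁ and θ₂ through two different matrices P₁ and P₂ and flagging weight positions where P₁θ₁ and P₂θ₂ agree closely as the high-precision ones.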