Format: PPT, 41 slides, 2.57 MB

稀疏表示ppt-Michael_Elad.ppt (Sparse Representation slides, Michael Elad)

MMSE Estimation for Sparse Representation Modeling
Michael Elad, The Computer Science Department, The Technion - Israel Institute of Technology, Haifa 32000, Israel
Joint work with Irad Yavneh and Matan Protter, The CS Department, The Technion
April 6th, 2009

Noise Removal?
In this talk we focus on signal/image denoising: removing additive noise from a measured signal. This problem is important because it is (i) a practical application and (ii) a convenient platform for testing basic ideas in signal/image processing. Many directions have been considered: partial differential equations, statistical estimators, adaptive filters, inverse problems and regularization, wavelets, example-based techniques, and sparse representations.
Main message today: several sparse representations can be found and used for better denoising performance. We introduce, motivate, discuss, demonstrate, and explain this new idea.

Agenda
Part I: Background on Denoising with Sparse Representations
Part II: Using More than One Representation: Intuition
Part III: Using More than One Representation: Theory
Part IV: A Closer Look at the Unitary Case
Part V: Summary and Conclusions

Part I: Background on Denoising with Sparse Representations

Denoising by Energy Minimization
Many of the proposed signal denoising algorithms are related to the minimization of an energy function of the form
    f(x) = (1/2) ||x - y||_2^2 + Pr(x),
where y is the given measurement, x is the unknown to be recovered, the first term expresses the relation to the measurements, and Pr(x) is the prior (regularization). This is in fact a Bayesian point of view, adopting Maximum A-posteriori Probability (MAP) estimation (Thomas Bayes, 1702-1761). Clearly, the wisdom in such an approach lies in the choice of the prior: modeling the signals of interest.

Sparse Representation Modeling
We fix a dictionary D of size N x K; every column of D is a prototype signal (atom). The representation vector alpha is generated randomly with few (say, L for now) non-zeros at random locations and with random values, and the signal is x = D alpha. Thus alpha is a sparse and random vector.

Back to Our MAP Energy Function
The L0 "norm" ||alpha||_0 effectively counts the number of non-zeros in alpha; alpha is the (sparse, redundant) representation. Bottom line: denoising of y is done by minimizing
    min_alpha ||alpha||_0  subject to  ||D alpha - y||_2^2 <= eps^2,
or its penalized counterpart trading ||alpha||_0 against the residual ||D alpha - y||_2^2, and setting x_hat = D alpha_hat.

The Solver We Use: Greed Based
The Matching Pursuit (MP) is one of the greedy algorithms that find one atom at a time (Mallat & Zhang, '93). Step 1: find the one atom that best matches the signal. Next steps: given the previously found atoms, find the next one that best fits the residual. The algorithm stops when the error drops below the destination threshold. The Orthogonal Matching Pursuit (OMP) is an improved version that re-evaluates the coefficients by least squares after each round.

Orthogonal Matching Pursuit
[Flowchart: initialization, a five-step main iteration, and a stopping test. OMP finds one atom at a time, approximating the solution of the L0 problem above.]

Part II: Using More than One Representation: Intuition

Back to the Beginning: What If...
Consider the denoising problem, and suppose that we can find a group of J candidate solutions, each sparse and each explaining the measurements well. Basic questions: What could we do with such a set of competing solutions in order to better denoise y? Why should this help? How shall we practically find such a set of solutions? Relevant work: Larsson & Selen ('07), Schniter et al. ('08), Elad & Yavneh ('08).

Motivation, in General
Why bother with such a set? Because each representation conveys a different story about the desired signal; because pursuit algorithms are often wrong in finding the sparsest representation, so relying on a single solution is too sensitive; and maybe there are "deeper" reasons.

Our Motivation
There is an intriguing relationship between this idea and common practice in example-based techniques, where several examples are merged. Consider the Non-Local-Means algorithm (Buades, Coll & Morel, '05): it uses (i) a local dictionary (the neighborhood patches), (ii) builds several sparse representations (each of cardinality 1), and (iii) merges them. Why not take this further and use general sparse representations?

Generating Many Representations
Our answer: randomizing the OMP. [Flowchart: same structure as OMP, with the atom-selection step drawn at random, shaped by a parameter c.] Larsson and Schniter propose a more complicated, deterministic tree-pruning method. For now, let us set the parameter c manually for best performance; later we shall define a way to set it automatically.
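The OMP loop and its randomized variant just described can be sketched in a few lines. This is a minimal illustration, not the authors' code: the selection rule w proportional to exp(c * (inner product)^2) is a stand-in for the talk's unspecified probability shaped by c.

```python
import numpy as np

def omp(D, y, k, rng=None, c=None):
    """Greedy sparse coding: pick k atoms one at a time.

    If c is None this is plain OMP (largest |inner product| wins);
    otherwise the next atom is drawn at random with probability
    proportional to exp(c * |inner product|^2), a RandOMP-style
    randomization (the exact weighting is an assumption here).
    """
    rng = np.random.default_rng() if rng is None else rng
    support = []
    residual = y.copy()
    for _ in range(k):
        corr = D.T @ residual                      # inner products with all atoms
        if c is None:
            j = int(np.argmax(np.abs(corr)))       # greedy choice (OMP)
        else:
            w = np.exp(c * corr**2)
            w[support] = 0.0                       # never re-pick a chosen atom
            j = int(rng.choice(len(w), p=w / w.sum()))
        support.append(j)
        Ds = D[:, support]
        coef, *_ = np.linalg.lstsq(Ds, y, rcond=None)  # LS re-fit: the "O" in OMP
        residual = y - Ds @ coef
    alpha = np.zeros(D.shape[1])
    alpha[support] = coef
    return alpha
```

Plain OMP is `omp(D, y, k)`; repeated calls with `c` set (e.g., `omp(D, y, k, c=0.5)`) produce the set of J competing representations the talk uses.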

Let's Try
Proposed experiment:
- Form a random dictionary D (of size 100 x 200).
- Multiply it by a sparse vector alpha_0.
- Add Gaussian iid noise v with sigma = 1 and obtain y = D alpha_0 + v.
- Solve the sparse coding problem using OMP and obtain alpha_hat_OMP.
- Use the Random-OMP several times and obtain a set of candidates alpha_hat_j.
Let's look at the obtained representations.

Some Observations
We see that the OMP gives the sparsest solution; nevertheless, it is not the most effective for denoising. The cardinality of a representation does not reveal its efficiency.

The Surprise (at Least for Us)
Let's propose the average of the Random-OMP solutions as our representation. This representation is NOT SPARSE AT ALL, but it gives a markedly better denoising result.
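The "Let's Try" experiment above can be reproduced as a rough sketch. The cardinality of alpha_0 (10), the randomized selection rule exp(c * corr^2), and the values of c and J are placeholders, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, card, J, c = 100, 200, 10, 10, 0.5

# Random dictionary with unit-norm atoms, a sparse alpha_0, and a noisy y.
D = rng.standard_normal((N, K))
D /= np.linalg.norm(D, axis=0)
alpha0 = np.zeros(K)
alpha0[rng.choice(K, card, replace=False)] = rng.standard_normal(card)
x = D @ alpha0
y = x + rng.standard_normal(N)                    # sigma = 1

def pursuit(c=None):
    """card rounds of (Rand)OMP: greedy if c is None, else randomized."""
    support, r = [], y.copy()
    for _ in range(card):
        corr = D.T @ r
        if c is None:
            j = int(np.argmax(np.abs(corr)))
        else:
            w = np.exp(c * corr**2)
            w[support] = 0.0
            j = int(rng.choice(K, p=w / w.sum()))
        support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        r = y - D[:, support] @ coef
    a = np.zeros(K)
    a[support] = coef
    return a

a_omp = pursuit()                                  # single OMP solution
a_avg = np.mean([pursuit(c) for _ in range(J)], axis=0)  # dense average

def mse(a):
    return np.mean((D @ a - x) ** 2)               # error against the CLEAN x

print("OMP MSE:    ", mse(a_omp))
print("Average MSE:", mse(a_avg))
```

Measuring error against the clean signal x (not y) is what separates "sparsest" from "best for denoising" in the observations above.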

Is It Consistent? Yes!
Here are the results of 1000 trials with the same parameters. [Scatter plot over the trials; the marked points are cases of the zero solution.] The advantage of the averaged representation holds consistently.

Part III: Using More than One Representation: Theory

Our Signal Model
D (of size N x K) is fixed and known. The vector alpha is built by:
- choosing the support s with probability P(s) from all the 2^K possibilities (for simplicity, assume that |s| = k is fixed and known);
- choosing the s coefficients as iid Gaussian entries, N(0, sigma_x).
The ideal signal is x = D alpha = D_s alpha_s. The p.d.f.'s P(alpha) and P(x) are therefore clear and known.

Adding Noise
We measure y = x + v, where the noise v is an additive white Gaussian vector with p.d.f. P_v(v). The conditional p.d.f.'s P(y|s), P(s|y), and even P(x|y) are then all clear and well-defined (although they may appear nasty).

The Key: The Posterior P(x|y)
With the posterior at hand we have access to two estimators: the MAP, maximizing P(x|y), and the MMSE, E[x|y]. Estimating alpha and then multiplying by D is equivalent to the above.
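The generative model just stated (uniform fixed-size support, iid Gaussian coefficients, additive white Gaussian noise) can be sampled in a few lines. The dimensions below echo the talk's small-dictionary experiment; whether N(0, sigma_x) denotes the variance or the standard deviation is ambiguous in the slides, so sigma_x is treated as a standard deviation here.

```python
import numpy as np

def sample_model(D, k, sigma_x, sigma, rng):
    """Draw (x, y, s) from the model: uniform support of size k,
    iid N(0, sigma_x^2) coefficients on s, white Gaussian noise.
    (Treating sigma_x as a standard deviation is an assumption.)"""
    N, K = D.shape
    s = np.sort(rng.choice(K, size=k, replace=False))   # support; P(s) uniform
    alpha = np.zeros(K)
    alpha[s] = sigma_x * rng.standard_normal(k)         # alpha_s coefficients
    x = D @ alpha                                       # ideal signal
    y = x + sigma * rng.standard_normal(N)              # noisy measurement
    return x, y, s

rng = np.random.default_rng(1)
D = rng.standard_normal((20, 30))
D /= np.linalg.norm(D, axis=0)
x, y, s = sample_model(D, k=3, sigma_x=1.0, sigma=0.3, rng=rng)
```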

These two estimators are impossible to compute directly, as we show next. A useful reference point is the oracle, which knows the true support s.

Let's Start with the Oracle
When s is known, the estimate reduces to a closed-form (regularized least-squares) computation on the atoms in s. Comments: this estimate is both the MAP and the MMSE (once s is given); the oracle estimate of x is obtained by multiplication by D_s.

The MAP Estimation
We have already seen the required probability of the support s, in the oracle's derivation. Implications: the MAP estimator requires testing all the possible supports for the maximization. In typical problems this is impossible, as there is a combinatorial set of possibilities. This is why we rarely use the exact MAP, and we typically replace it with approximation algorithms (e.g., OMP).

The MMSE Estimation
The MMSE estimate is E[x|y], a sum over supports s of P(s|y) times the oracle estimate for s, as we have seen before. Implications: the best estimator (in terms of L2 error) is a weighted average of many sparse representations! As in the MAP case, in typical problems one cannot compute this expression, as the summation runs over a combinatorial set of possibilities. We should propose approximations here as well. (These posterior weights are what the parameter c in the Random-OMP mimics.)
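For a toy-sized dictionary, both estimators and the posterior weights P(s|y) can be computed exhaustively, which is exactly why the talk's comparative results use a small dictionary. A sketch under the stated model (uniform support prior, noise standard deviation sigma); the Gaussian-likelihood form P(y|s) = N(0, sigma_x^2 D_s D_s^T + sigma^2 I) is derived from that model, not quoted from the slides:

```python
import numpy as np
from itertools import combinations

def exact_estimates(D, y, k, sigma_x, sigma):
    """Exhaustive MAP and MMSE over all supports of size k.

    Exact but combinatorial in K, hence only for tiny dictionaries.
    Assumes a uniform prior over supports.
    """
    N, K = D.shape
    logws, xs = [], []
    for s in combinations(range(K), k):
        Ds = D[:, list(s)]
        C = sigma_x**2 * (Ds @ Ds.T) + sigma**2 * np.eye(N)   # Cov(y | s)
        sign, logdet = np.linalg.slogdet(C)
        logws.append(-0.5 * (y @ np.linalg.solve(C, y) + logdet))
        # Oracle E[x | y, s] for a jointly Gaussian (x, y) given s:
        xs.append(sigma_x**2 * Ds @ Ds.T @ np.linalg.solve(C, y))
    logws = np.array(logws)
    w = np.exp(logws - logws.max())
    w /= w.sum()                                   # posterior weights P(s|y)
    x_map = xs[int(np.argmax(w))]                  # oracle on the best support
    x_mmse = sum(wi * xi for wi, xi in zip(w, xs)) # weighted average of oracles
    return x_map, x_mmse
```

The MMSE line makes the slide's point concrete: the optimal estimate is a weighted average of many (sparse) oracle solutions, one per support.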

The Case of |s| = k = 1
For a single-atom support, P(s|y) reduces to a simple per-atom expression, driven by the inner product of y with the k-th atom in D. Based on this we can propose a greedy algorithm for both MAP and MMSE:
- MAP: choose the atom with the largest inner product (out of K), one at a time, while freezing the previous ones (almost OMP).
- MMSE: draw an atom at random in a greedy fashion, based on the above probability set, getting close to P(s|y) in the overall draw.

Bottom Line
- The MMSE estimation we got requires a sweep through all supports (a combinatorial search): impractical.
- Similarly, an explicit expression for P(x|y) can be derived and maximized; this is the MAP estimation, and it too requires a sweep through all possible supports: impractical as well.
- The OMP is a (good) approximation of the MAP estimate.
- The Random-OMP is a (good) approximation of the Minimum Mean-Squared-Error (MMSE) estimate; it is close to a Gibbs sampler of the probability P(s|y), from which we should draw the weights.
Back to the beginning: why use several representations? Because their average leads to a provably better noise suppression.

Comparative Results
[Plot: relative mean-squared error versus sigma in the range 0.5 to 1.5.] The following results correspond to a small dictionary (20 x 30), where the combinatorial formulas can be evaluated as well. Parameters: N = 20, K = 30, true support size 3, sigma_x = 1, J = 10 (RandOMP runs), averaged over 1000 experiments. Curves: 1. Empirical Oracle, 2. Theoretical Oracle, 3. Empirical MMSE, 4. Theoretical MMSE, 5. Empirical MAP, 6. Theoretical MAP, 7. OMP, 8. RandOMP (the oracle curves assume a known support).

Part IV: A Closer Look at the Unitary Case

Few Basic Observations
Let us denote beta = D^T y; with a unitary D, the oracle estimate becomes a simple entry-wise shrinkage of beta on the support s.

Back to the MAP Estimation
We assume |s| = k, fixed, with equal probabilities over supports; part of the posterior then becomes a constant and can be discarded. This means that MAP estimation can be easily evaluated by computing |beta|, sorting its entries in descending order, and choosing the k leading ones.

Closed-Form Estimation
It is well known that MAP enjoys a closed-form, simple solution in the case of a unitary dictionary D. This closed-form solution takes the structure of thresholding or shrinkage; the specific structure depends on the fine details of the assumed model. It is also known that OMP becomes exact in this case. What about the MMSE? Could it have a simple closed-form solution too?

The MMSE Again
This is the formula we got: the MMSE estimate combines linearly many sparse representations (with proper weights), and the result is one effective representation (not sparse anymore).

We change the above summation over supports into one with K contributions, one per atom, to be found and used. We have developed a closed-form recursive formula for computing the q coefficients.

Towards a Recursive Formula
We have seen that the governing probability for the weighted averaging is P(s|y); each atom j enters it through an indicator function stating whether j is in s.

The Recursive Formula
[The slide gives the recursion for the q coefficients; the explicit expressions are omitted here.]

An Example
[Plot: relative mean-squared error versus sigma in the range 0.5 to 1.5, y-range roughly 0.07 to 0.1; curves: Oracle, Theoretical MAP, OMP, Recursive MMSE, Theoretical MMSE, RandOMP.] This is a synthetic experiment resembling the previous one, but with a few important changes: D is unitary; the representation cardinality is 5 (the higher it is, the weaker the Random-OMP becomes); the dimensions are different, N = K = 64; and J = 20 (RandOMP runs).

Part V: Summary and Conclusions

Today We Have Seen That...
Sparsity and redundancy are used for denoising of signals/images. How? By finding the sparsest representation and using it to recover the clean signal.

Can we do better? Today we have shown that averaging several sparse representations of a signal leads to better denoising, as it approximates the MMSE estimator. More on this (including the slides and the relevant papers) can be found at www.cs.technion.ac.il/elad
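To close the loop on Part IV, here is a minimal sketch of the unitary-case MAP denoiser described there: compute beta = D^T y, keep the k leading entries of |beta|, and map back. The Wiener shrinkage factor sigma_x^2 / (sigma_x^2 + sigma^2) applied to the selected entries is an assumption derived from the Gaussian coefficient model; the talk's exact MMSE recursion is not reproduced here.

```python
import numpy as np

def unitary_map_denoise(D, y, k, sigma_x, sigma):
    """MAP denoising with a unitary dictionary: transform, keep the k
    largest |beta_j|, shrink them, and transform back.

    The hard top-k selection follows the talk; the scalar shrinkage
    sigma_x^2 / (sigma_x^2 + sigma^2) is the Wiener factor implied by
    the Gaussian coefficient model (an assumption, not quoted)."""
    beta = D.T @ y                          # for unitary D, a rotation of y
    s = np.argsort(np.abs(beta))[-k:]       # the k leading entries of |beta|
    alpha = np.zeros_like(beta)
    alpha[s] = (sigma_x**2 / (sigma_x**2 + sigma**2)) * beta[s]
    return D @ alpha                        # back to the signal domain

# A unitary dictionary via QR, with a signal drawn from the Part III model.
rng = np.random.default_rng(7)
N, k, sigma_x, sigma = 64, 5, 1.0, 0.3
D, _ = np.linalg.qr(rng.standard_normal((N, N)))
alpha0 = np.zeros(N)
alpha0[rng.choice(N, k, replace=False)] = sigma_x * rng.standard_normal(k)
x = D @ alpha0
y = x + sigma * rng.standard_normal(N)
x_hat = unitary_map_denoise(D, y, k, sigma_x, sigma)
```

Because D is unitary, no least-squares step is needed: selection and shrinkage act independently per transform coefficient, which is why OMP becomes exact in this case.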
