{"id":102019,"date":"2019-07-09T07:49:00","date_gmt":"2019-07-09T07:49:00","guid":{"rendered":"http:\/\/ahay.org\/blog\/?p=102019"},"modified":"2019-09-09T20:54:06","modified_gmt":"2019-09-09T20:54:06","slug":"program-of-the-month-sflpf","status":"publish","type":"post","link":"https:\/\/ahay.org\/blog\/2019\/07\/09\/program-of-the-month-sflpf\/","title":{"rendered":"Program of the month: sflpf"},"content":{"rendered":"<p><a href=\"\/RSF\/sflpf.html\">sflpf<\/a> estimates a non-stationary filter using shaping regularization.<\/p>\n<p>The method is described in the reproducible paper <a href=\"\/RSF\/book\/tccs\/lpf\/paper_html\/\">Adaptive multiple subtraction using regularized nonstationary regression<\/a>.<\/p>\n<p>The following example from <a href=\"\/RSF\/book\/tccs\/lpf\/plut.html\">tccs\/lpf\/plut<\/a> shows a common-offset section from the Pluto synthetic dataset before and after adaptive multiple subtraction with the help of <strong>sflpf<\/strong>.<\/p>\n<p><img decoding=\"async\" src=\"\/RSF\/book\/tccs\/lpf\/plut\/Fig\/ref.png\" alt=\"\" title=\"\"> <img decoding=\"async\" src=\"\/RSF\/book\/tccs\/lpf\/plut\/Fig\/sig.png\" alt=\"\" title=\"\"><\/p>\n<p>Given target data $m(\\mathbf{x})$ (specified with the <strong>match=<\/strong> parameter) and a collection of fitting functions $s_k(\\mathbf{x})$ (specified in the standard input), <strong>sflpf<\/strong> finds the fitting coefficients $b_k(\\mathbf{x})$ by minimizing the error<\/p>\n<p>$m(\\mathbf{x}) - \\displaystyle \\sum_{k=1}^{N} b_k(\\mathbf{x})\\,s_k(\\mathbf{x})$<\/p>\n<p>while constraining the coefficients to be smooth. 
The smoothness is controlled by the <strong>rect#=<\/strong> parameters, as in <a href=\"\/blog\/2012\/01\/01\/program-of-the-month-sfsmooth\/\">sfsmooth<\/a>.<\/p>\n<p>Shaping regularization is carried out iteratively; the <strong>niter=<\/strong> parameter controls the number of iterations.<\/p>\n<p>The mean coefficient from the example above is shown in the figure below.<\/p>\n<p><img decoding=\"async\" src=\"\/RSF\/book\/tccs\/lpf\/plut\/Fig\/csum.png\" alt=\"\" title=\"\"><\/p>\n<p>Optionally, a prediction-error filter can be applied to whiten the residual. The filter is specified with the help of the <strong>pef=<\/strong> and <strong>lag=<\/strong> parameters, with a multidimensional helical filter specified as in <a href=\"\/blog\/2014\/05\/13\/program-of-the-month-sfhelicon\/\">sfhelicon<\/a>.<\/p>\n<p>The complex version of the same program is <a href=\"\/RSF\/sfclpf.html\">sfclpf<\/a>.<\/p>\n<h3 id=\"10previousprogramsofthemonth\">10 previous programs of the month:<\/h3>\n<ul>\n<li><a href=\"\/blog\/2019\/06\/12\/program-of-the-month-sfslice\/\">sfslice<\/a><\/li>\n<li><a href=\"\/blog\/2019\/06\/06\/program-of-the-month-sfzomig3\/\">sfzomig3<\/a><\/li>\n<li><a href=\"\/blog\/2017\/04\/19\/program-of-the-month-sfseislet\/\">sfseislet<\/a><\/li>\n<li><a href=\"\/blog\/2016\/02\/18\/program-of-the-month-sfmig2\/\">sfmig2<\/a><\/li>\n<li><a href=\"\/blog\/2016\/01\/16\/program-of-the-month-sfsort\/\">sfsort<\/a><\/li>\n<li><a href=\"\/blog\/2015\/12\/22\/program-of-the-month-sfdivn\/\">sfdivn<\/a><\/li>\n<li><a href=\"\/blog\/2015\/11\/16\/program-of-the-month-sfpldb-and-sfplas\/\">sfpldb and sfplas<\/a><\/li>\n<li><a href=\"\/blog\/2015\/10\/15\/program-of-the-month-sfisolr2\/\">sfisolr2<\/a><\/li>\n<li><a href=\"\/blog\/2015\/09\/14\/program-of-the-month-sfsimilarity\/\">sfsimilarity<\/a><\/li>\n<li><a href=\"\/blog\/2015\/07\/10\/program-of-the-month-sfmutter\/\">sfmutter<\/a><\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>sflpf estimates a non-stationary 
filter using shaping regularization. The method is described in the reproducible paper Adaptive multiple subtraction using regularized nonstationary regression. The following example from tccs\/lpf\/plut shows a common-offset section from the Pluto synthetic dataset before and after adaptive multiple subtraction with the help of sflpf. Given target data $m(\\mathbf{x})$ (specified with match= [&hellip;]<\/p>\n","protected":false,"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_import_markdown_pro_load_document_selector":0,"_import_markdown_pro_submit_text_textarea":"","activitypub_content_warning":"","activitypub_content_visibility":"local","activitypub_max_image_attachments":4,"activitypub_interaction_policy_quote":"","footnotes":""},"categories":[3],"tags":[],"class_list":["post-102019","post","type-post","status-publish","format-standard","hentry","category-programs"],"_links":{"self":[{"href":"https:\/\/ahay.org\/blog\/wp-json\/wp\/v2\/posts\/102019","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ahay.org\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/ahay.org\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/ahay.org\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/ahay.org\/blog\/wp-json\/wp\/v2\/comments?post=102019"}],"version-history":[{"count":10,"href":"https:\/\/ahay.org\/blog\/wp-json\/wp\/v2\/posts\/102019\/revisions"}],"predecessor-version":[{"id":102317,"href":"https:\/\/ahay.org\/blog\/wp-json\/wp\/v2\/posts\/102019\/revisions\/102317"}],"wp:attachment":[{"href":"https:\/\/ahay.org\/blog\/wp-json\/wp\/v2\/media?parent=102019"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/ahay.org\/blog\/wp-json\/wp\/v2\/categories?post=102019"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/ahay.org\/blog\/wp-json\/wp\/v2\/tags?post=102019"}],
"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}