Updated on 2026/04/17

KUME KEITA
 
Organization
School of Engineering
Title
Assistant Professor
Degree

  • Doctor of Engineering (2024.3, Tokyo Institute of Technology)

Research Interests

  • Signal processing

  • nonsmooth optimization

  • Cayley transform

  • Stiefel manifold

  • nonconvex optimization

Research Areas

  • Natural Science / Applied mathematics and statistics / Continuous optimization

  • Informatics / Soft computing / Signal processing

Education

  • Tokyo Institute of Technology   School of Engineering

    2021.4 - 2024.3

  • Tokyo Institute of Technology   School of Engineering

    2019.4 - 2021.3

  • Tokyo Institute of Technology   School of Engineering

    2015.4 - 2019.3

  • Hokkaido Sapporo Minami High School

    2012.4 - 2015.3

Research History

  • Institute of Science Tokyo   Dept. of Information and Communications Engineering, School of Engineering   Assistant Professor

    2024.10

    Country: Japan

  • Tokyo Institute of Technology   Dept. of Information and Communications Engineering, School of Engineering   Assistant Professor

    2024.4 - 2024.9

    Country: Japan

  • Japan Society for the Promotion of Science   JSPS Research Fellow (DC1)

    2021.4 - 2024.3

Committee Memberships

  • The Operations Research Society of Japan, Research Group on Modeling and Algorithms for Optimization (MATCHA)   Secretary

    2026.3   

    Committee type: Academic society

  • The Operations Research Society of Japan   General Affairs Secretary

    2025.4   

Papers

MISC

  • A DC Composite Optimization via Variable Smoothing for Robust Phase Retrieval with Nonconvex Loss Functions

    Kumataro Yazawa, Keita Kume, Isao Yamada

    2026.4

    In this paper, we propose an optimization-based method for the robust phase retrieval problem, where the goal is to estimate an unknown signal from quadratic measurements corrupted by outliers. To enhance the robustness of existing optimization models with the $\ell_1$ loss function, we propose a generalized model that can handle DC (Difference-of-Convex) loss functions beyond the $\ell_1$ loss. We view the cost function of the proposed model as a composition of a DC function with a smooth mapping, and develop a variable smoothing algorithm for minimizing such DC composite functions. At each step of our algorithm, we generate a smooth surrogate function by using the Moreau envelope of each (weakly) convex function in the DC function, and then perform a gradient descent update of the surrogate function. Unlike many existing algorithms for DC problems, the proposed algorithm does not require any inner loop. We also present a convergence analysis, in terms of a DC composite critical point, for the proposed algorithm. Our numerical experiments demonstrate that the proposed method with DC loss functions is more robust against outliers than existing methods with the $\ell_1$ loss.

    Other Link: https://arxiv.org/pdf/2604.07686v2
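The variable smoothing idea described in the abstract above can be illustrated on a toy problem. The sketch below is not the authors' code: for simplicity it uses a linear measurement model and the plain $\ell_1$ loss (a special case of the DC losses the paper targets, with zero concave part), and the $k^{-1/3}$ smoothing schedule and Lipschitz-based step size are illustrative assumptions. The Moreau envelope of the $\ell_1$ norm is smooth, its gradient is available in closed form through the soft-thresholding proximity operator, and plain gradient descent is run while the smoothing parameter shrinks:

```python
import numpy as np

def prox_l1(v, mu):
    # proximity operator of mu * ||.||_1 (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - mu, 0.0)

def grad_env_l1(v, mu):
    # gradient of the Moreau envelope of ||.||_1 with parameter mu,
    # computed from the proximity operator: (v - prox(v)) / mu
    return (v - prox_l1(v, mu)) / mu

# toy problem: min_x ||A x - b||_1 (outlier-free, so the minimum value is 0)
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = rng.standard_normal(10)
b = A @ x_true

L = np.linalg.norm(A, 2) ** 2   # ||A||^2; surrogate gradient is (L/mu)-Lipschitz
x = np.zeros(10)
for k in range(1, 2001):
    mu = k ** (-1.0 / 3.0)                 # decreasing smoothing parameter
    g = A.T @ grad_env_l1(A @ x - b, mu)   # gradient of the smoothed surrogate
    x = x - (mu / L) * g                   # gradient step, step size = 1/Lipschitz

print(np.linalg.norm(A @ x - b, 1))        # residual shrinks toward 0
```

No inner solver is needed: each iteration costs one soft-thresholding and two matrix-vector products, mirroring the inner-loop-free structure described in the abstract.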

  • Linearly involved Generalized Moreau Enhanced Model with Non-quadratic Smooth Convex Data Fidelity Functions

    Wataru Yata, Keita Kume, Isao Yamada

    2025.9

    In this paper, we introduce an overall convex model incorporating a nonconvex regularizer. The proposed model is designed by extending the least-squares term in the constrained LiGME model [Yata, Yamagishi, Yamada 2022] to fairly general smooth convex functions, allowing flexible use of non-quadratic data fidelity functions. Under an overall convexity condition for the proposed model, we present sufficient conditions for the existence of a minimizer and an inner-loop-free algorithm with guaranteed convergence to a global minimizer of the proposed model. To demonstrate the effectiveness of the proposed model and algorithm, we conduct numerical experiments on a Poisson denoising problem and a simultaneous declipping and denoising problem.

    Other Link: https://arxiv.org/pdf/2509.03258v2

  • A Proximal Variable Smoothing for Minimization of Nonlinearly Composite Nonsmooth Function -- Maxmin Dispersion and MIMO Applications

    Keita Kume, Isao Yamada

    2025.6

    Authorship:Lead author, Corresponding author   Language:English  

    DOI: 10.48550/arXiv.2506.05974

  • A Variable Smoothing for Weakly Convex Composite Minimization with Nonconvex Constraint

    Keita Kume, Isao Yamada

    2024.12

    In this paper, we address a nonconvexly constrained nonsmooth optimization problem involving the composition of a weakly convex function and a smooth mapping. To find a stationary point of the target problem, we propose a variable smoothing-type algorithm by combining the ideas of (i) translating the constrained problem into a Euclidean optimization problem through a smooth parametrization of the constraint set; (ii) exploiting a sequence of smoothed surrogates of the cost function, given by the Moreau envelope of the weakly convex function. The proposed algorithm produces a vector sequence by the gradient descent update of a smoothed surrogate function at each iteration. In the case where the proximity operator of the weakly convex function is available, the proposed algorithm does not require any iterative solver for subproblems therein. By leveraging tools in variational analysis, we show the so-called {\em gradient consistency property}, a key ingredient for smoothing-type algorithms, of the smoothed surrogate function used in this paper. Based on the gradient consistency property, we also establish an asymptotic convergence analysis for the proposed algorithm. Numerical experiments demonstrate the efficacy of the proposed algorithm.

    Other Link: http://arxiv.org/pdf/2412.04225v2
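The constrained setting above can be sketched on a toy instance. This is again not the authors' code: the sphere constraint, its y ↦ y/‖y‖ parametrization, the step-size rule, and the smoothing schedule are all illustrative assumptions. The point is idea (i)+(ii) from the abstract: feasibility is encoded in a smooth parametrization of the constraint set (here the unit sphere, parametrized over the punctured Euclidean space), and gradient descent is applied to a Moreau-envelope surrogate of an $\ell_1$ cost composed with that parametrization:

```python
import numpy as np

def prox_l1(v, mu):
    # proximity operator of mu * ||.||_1 (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - mu, 0.0)

def grad_env_l1(v, mu):
    # gradient of the Moreau envelope of ||.||_1 with parameter mu
    return (v - prox_l1(v, mu)) / mu

# toy problem: min ||A x - b||_1 subject to ||x|| = 1, handled through the
# smooth parametrization x(y) = y / ||y|| of the sphere (valid for y != 0)
rng = np.random.default_rng(2)
n = 8
A = rng.standard_normal((20, n))
x_star = rng.standard_normal(n)
x_star /= np.linalg.norm(x_star)
b = A @ x_star

L = np.linalg.norm(A, 2) ** 2
y = rng.standard_normal(n)
f_init = np.linalg.norm(A @ (y / np.linalg.norm(y)) - b, 1)
for k in range(1, 3001):
    mu = k ** (-1.0 / 3.0)
    ny = np.linalg.norm(y)
    x = y / ny
    J = (np.eye(n) - np.outer(x, x)) / ny         # Jacobian of y -> y / ||y||
    g = J.T @ (A.T @ grad_env_l1(A @ x - b, mu))  # chain rule through parametrization
    y = y - 0.5 * mu * ny ** 2 / L * g            # conservative step (illustrative)

x = y / np.linalg.norm(y)                         # feasible by construction
print(np.linalg.norm(A @ x - b, 1), f_init)
```

Every iterate x is exactly feasible because feasibility comes from the parametrization rather than from a projection step, and no inner solver is required once the proximity operator is available.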

  • Adaptive Localized Cayley Parametrization for Optimization over Stiefel Manifold

    Keita Kume, Isao Yamada

    2023.5

    We present an adaptive parametrization strategy for optimization problems over the Stiefel manifold that uses generalized Cayley transforms to efficiently exploit powerful Euclidean optimization algorithms. The generalized Cayley transform translates an open dense subset of the Stiefel manifold into a vector space, where the open dense subset is determined by a tunable parameter called a center point. With the generalized Cayley transform, we recently proposed the naive Cayley parametrization, which reformulates the optimization problem over the Stiefel manifold as one over the vector space. Although this reformulation enables us to transplant powerful Euclidean optimization algorithms, their convergence may become slow under a poor choice of center points. To avoid such slow convergence, in this paper, we propose to adaptively estimate 'good' center points so that the reformulated problem can be solved faster. We also present a unified convergence analysis, regarding the gradient, for cases where fairly standard Euclidean optimization algorithms are employed in the proposed adaptive parametrization strategy. Numerical experiments demonstrate that (i) the proposed strategy escapes the slow convergence observed in the naive Cayley parametrization strategy; (ii) the proposed strategy outperforms the standard strategy employing a retraction.

    Other Link: http://arxiv.org/pdf/2305.17901v1
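For context, the classical (ungeneralized) Cayley transform already maps the vector space of skew-symmetric matrices onto an open dense subset of the orthogonal group; keeping the first p columns lands on the Stiefel manifold St(p, n). The sketch below shows only this basic transform, which corresponds to one fixed center point; the paper's generalized transform with adaptively estimated center points is not reproduced here:

```python
import numpy as np

def cayley(V):
    # Cayley transform: a skew-symmetric V maps to the orthogonal
    # matrix (I - V)^{-1} (I + V)
    I = np.eye(V.shape[0])
    return np.linalg.solve(I - V, I + V)

def to_stiefel(V, p):
    # a point on St(p, n): the first p columns of the orthogonal image of V
    return cayley(V)[:, :p]

rng = np.random.default_rng(1)
n, p = 5, 2
B = rng.standard_normal((n, n))
V = (B - B.T) / 2              # skew-symmetric point of the parametrizing vector space
U = to_stiefel(V, p)
print(np.allclose(U.T @ U, np.eye(p)))   # columns are orthonormal
```

Since (I - V) and (I + V) commute, Qᵀ Q = I holds exactly in exact arithmetic, so any Euclidean update of V yields a feasible Stiefel point; numerically the orthonormality holds to machine precision.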

Presentations

  • On optimization over the Stiefel manifold based on adaptive Cayley parametrization Invited

    Keita Kume

    IEICE SIP  2023.8 

    Event date: 2023.8

    Presentation type:Oral presentation (invited, special)  

  • Cayley parametrization strategy for optimization over the Stiefel manifold Invited

    Keita Kume

    10th International Congress on Industrial and Applied Mathematics (ICIAM 2023)  2023.8 

    Presentation type:Oral presentation (invited, special)  

  • A variable smoothing method for optimization of compositions of weakly convex functions and differentiable mappings, and its signal processing applications Invited

    Keita Kume

    The 8th Workshop on Optimization: Theory and Algorithms (RAOTA)  2025.1 

    Language:Japanese   Presentation type:Oral presentation (invited, special)  

    File: KumeRAOTA2025_Web3.pdf

  • A Proximal Variable Smoothing for Nonsmooth Minimization of the Sum of Three Functions Including Weakly Convex Composite Function Invited

    Keita Kume, Isao Yamada

    The 22nd EUROPT Conference on Advances in Continuous Optimization (EUROPT 2025)  2025.6 

    Language:English   Presentation type:Oral presentation (invited, special)  

    File: Kume-YamadaEUROPT2025.pdf

Awards

  • IEEE SPS Japan Student Conference Paper Award

    2024.12   IEEE SPS Tokyo Joint Chapter   A Variable Smoothing for Nonconvexly Constrained Nonsmooth Optimization with Application to Sparse Spectral Clustering

    Keita Kume

  • FY2020 Young Researchers Award in Signal Processing

    2021.11   IEICE Technical Committee on Signal Processing   A Global Cayley Parametrization of Stiefel Manifold for Direct Importing Optimization Mechanisms over Vector Space

  • Outstanding Student Award (Master's), Dept. of Information and Communications Engineering

    2021.3   Dept. of Information and Communications Engineering, Tokyo Institute of Technology

Research Projects

  • Deepening data-driven fixed-point-constrained optimization theory and its application to sparse signal processing

    Grant number:26K21332  2026.4 - 2030.3

    Japan Society for the Promotion of Science  Grants-in-Aid for Scientific Research (KAKENHI)  Early-Career Scientists

    Keita Kume

    Grant amount: ¥4,550,000 (Direct Cost: ¥3,500,000, Indirect Cost: ¥1,050,000)

  • Development of data-driven constrained optimization algorithms for high-accuracy signal restoration

    2026.4 - 2027.3

    The Telecommunications Advancement Foundation  Research Grant

    Authorship:Principal investigator 

  • Construction of a nonsmooth optimization theory with low-rank constraints and its application to robust low-rank matrix completion

    Grant number:24K23885  2024.7 - 2026.3

    Japan Society for the Promotion of Science  Grants-in-Aid for Scientific Research (KAKENHI)  Research Activity Start-up

    Keita Kume

    Grant amount: ¥2,860,000 (Direct Cost: ¥2,200,000, Indirect Cost: ¥660,000)

  • A new Cayley transform theory for optimization over the Stiefel manifold and its data science applications

    Grant number:22KJ1270  2023.3 - 2024.3

    Japan Society for the Promotion of Science  Grants-in-Aid for Scientific Research (KAKENHI)  Grant-in-Aid for JSPS Fellows

    Keita Kume

    Grant amount: ¥2,200,000 (Direct Cost: ¥2,200,000)

    The goal of this research is to realize a new optimization strategy that solves the optimization problem over the Stiefel manifold, a foundational problem in many data science applications, quickly and in a numerically stable manner. In FY2022 we addressed the numerical instability of the Cayley parametrization method, which relaxes the problem over the Stiefel manifold into a simpler optimization problem over a Euclidean space; this instability originates from solving the relaxed problem. The proposed dynamic Cayley parametrization method instead solves a family of Euclidean-space optimization problems that is equivalent to the original problem over the Stiefel manifold, and can therefore solve it in a numerically stable way. Within the dynamic Cayley parametrization method, existing Euclidean-space optimization algorithms can be applied to each Euclidean-space problem, so employing algorithms with fast convergence can be expected to yield fast optimization algorithms over the Stiefel manifold. In FY2022 we also developed a unified convergence analysis of the dynamic Cayley parametrization method: when a wide class of Euclidean-space optimization algorithms (gradient descent, conjugate gradient, Nesterov's accelerated gradient, etc.) is employed within the method, global convergence of the generated sequence, in terms of stationary points, is guaranteed.
    A paper summarizing the results on the Cayley parametrization method has been published in a Q1 journal in mathematical optimization (Optimization). The results on the dynamic Cayley parametrization method have been presented at several domestic conferences.
