2025/08/15 18:50:43 extracted 303751 symbol hashes for base and 303751 for patched
2025/08/15 18:50:43 adding modified_functions to focus areas: ["__io_cqring_overflow_flush" "__io_issue_sqe" "__io_submit_flush_completions" "__se_sys_io_uring_enter" "ctx_flush_and_put" "io_add_aux_cqe" "io_fill_cqe_aux32" "io_handle_tw_list" "io_issue_sqe" "io_post_aux_cqe" "io_queue_async" "io_queue_iowq" "io_queue_next" "io_queue_sqe_fallback" "io_req_post_cqe" "io_req_queue_iowq_tw" "io_req_task_submit" "io_submit_sqes" "io_wq_free_work" "io_wq_submit_work" "nvmet_execute_disc_identify"]
2025/08/15 18:50:43 adding directly modified files to focus areas: ["include/linux/poison.h" "io_uring/io_uring.c"]
2025/08/15 18:50:44 downloaded the corpus from https://storage.googleapis.com/syzkaller/corpus/ci-upstream-kasan-gce-root-corpus.db
2025/08/15 18:51:34 runner 9 connected
2025/08/15 18:51:35 runner 2 connected
2025/08/15 18:51:35 runner 5 connected
2025/08/15 18:51:40 initializing coverage information...
2025/08/15 18:51:41 executor cover filter: 0 PCs
2025/08/15 18:51:42 runner 7 connected
2025/08/15 18:51:42 runner 6 connected
2025/08/15 18:51:42 runner 8 connected
2025/08/15 18:51:42 runner 1 connected
2025/08/15 18:51:42 runner 4 connected
2025/08/15 18:51:42 runner 0 connected
2025/08/15 18:51:42 runner 0 connected
2025/08/15 18:51:45 discovered 7699 source files, 338620 symbols
2025/08/15 18:51:45 coverage filter: __io_cqring_overflow_flush: [__io_cqring_overflow_flush]
2025/08/15 18:51:45 coverage filter: __io_issue_sqe: [__io_issue_sqe]
2025/08/15 18:51:45 coverage filter: __io_submit_flush_completions: [__io_submit_flush_completions]
2025/08/15 18:51:45 coverage filter: __se_sys_io_uring_enter: [__se_sys_io_uring_enter]
2025/08/15 18:51:45 coverage filter: ctx_flush_and_put: [ctx_flush_and_put]
2025/08/15 18:51:45 coverage filter: io_add_aux_cqe: [io_add_aux_cqe]
2025/08/15 18:51:45 coverage filter: io_fill_cqe_aux32: [io_fill_cqe_aux32]
2025/08/15 18:51:45 coverage filter: io_handle_tw_list: [io_handle_tw_list]
2025/08/15 18:51:45 coverage filter: io_issue_sqe: [io_issue_sqe]
2025/08/15 18:51:45 coverage filter: io_post_aux_cqe: [io_post_aux_cqe]
2025/08/15 18:51:45 coverage filter: io_queue_async: [io_queue_async]
2025/08/15 18:51:45 coverage filter: io_queue_iowq: [io_queue_iowq]
2025/08/15 18:51:45 coverage filter: io_queue_next: [io_queue_next]
2025/08/15 18:51:45 coverage filter: io_queue_sqe_fallback: [io_queue_sqe_fallback]
2025/08/15 18:51:45 coverage filter: io_req_post_cqe: [io_req_post_cqe io_req_post_cqe32]
2025/08/15 18:51:45 coverage filter: io_req_queue_iowq_tw: [io_req_queue_iowq_tw]
2025/08/15 18:51:45 coverage filter: io_req_task_submit: [io_req_task_submit]
2025/08/15 18:51:45 coverage filter: io_submit_sqes: [io_submit_sqes]
2025/08/15 18:51:45 coverage filter: io_wq_free_work: [io_wq_free_work]
2025/08/15 18:51:45 coverage filter: io_wq_submit_work: [io_wq_submit_work]
2025/08/15 18:51:45 coverage filter: nvmet_execute_disc_identify: [nvmet_execute_disc_identify]
2025/08/15 18:51:45 coverage filter: include/linux/poison.h: []
2025/08/15 18:51:45 coverage filter: io_uring/io_uring.c: [io_uring/io_uring.c]
2025/08/15 18:51:45 area "symbols": 882 PCs in the cover filter
2025/08/15 18:51:45 area "files": 2520 PCs in the cover filter
2025/08/15 18:51:45 area "": 0 PCs in the cover filter
2025/08/15 18:51:45 executor cover filter: 0 PCs
2025/08/15 18:51:46 machine check: disabled the following syscalls:
openat$acpi_thermal_rel : failed to open /dev/acpi_thermal_rel: no such file or
directory openat$ashmem : failed to open /dev/ashmem: no such file or directory openat$bifrost : failed to open /dev/bifrost: no such file or directory openat$binder : failed to open /dev/binder: no such file or directory openat$camx : failed to open /dev/v4l/by-path/platform-soc@0:qcom_cam-req-mgr-video-index0: no such file or directory openat$capi20 : failed to open /dev/capi20: no such file or directory openat$cdrom1 : failed to open /dev/cdrom1: no such file or directory openat$damon_attrs : failed to open /sys/kernel/debug/damon/attrs: no such file or directory openat$damon_init_regions : failed to open /sys/kernel/debug/damon/init_regions: no such file or directory openat$damon_kdamond_pid : failed to open /sys/kernel/debug/damon/kdamond_pid: no such file or directory openat$damon_mk_contexts : failed to open /sys/kernel/debug/damon/mk_contexts: no such file or directory openat$damon_monitor_on : failed to open /sys/kernel/debug/damon/monitor_on: no such file or directory openat$damon_rm_contexts : failed to open /sys/kernel/debug/damon/rm_contexts: no such file or directory openat$damon_schemes : failed to open /sys/kernel/debug/damon/schemes: no such file or directory openat$damon_target_ids : failed to open /sys/kernel/debug/damon/target_ids: no such file or directory openat$hwbinder : failed to open /dev/hwbinder: no such file or directory openat$i915 : failed to open /dev/i915: no such file or directory openat$img_rogue : failed to open /dev/img-rogue: no such file or directory openat$irnet : failed to open /dev/irnet: no such file or directory openat$keychord : failed to open /dev/keychord: no such file or directory openat$kvm : failed to open /dev/kvm: no such file or directory openat$lightnvm : failed to open /dev/lightnvm/control: no such file or directory openat$mali : failed to open /dev/mali0: no such file or directory openat$md : failed to open /dev/md0: no such file or directory openat$msm : failed to open /dev/msm: no such file or directory openat$ndctl0 : failed to open /dev/ndctl0: no such file or directory openat$nmem0 : failed to open /dev/nmem0: no such file or directory openat$pktcdvd : failed to open /dev/pktcdvd/control: no such file or directory openat$pmem0 : failed to open /dev/pmem0: no such file or directory openat$proc_capi20 : failed to open /proc/capi/capi20: no such file or directory openat$proc_capi20ncci : failed to open /proc/capi/capi20ncci: no such file or directory openat$proc_reclaim : failed to open /proc/self/reclaim: no such file or directory openat$ptp1 : failed to open /dev/ptp1: no such file or directory openat$rnullb : failed to open /dev/rnullb0: no such file or directory openat$selinux_access : failed to open /selinux/access: no such file or directory openat$selinux_attr : selinux is not enabled openat$selinux_avc_cache_stats : failed to open /selinux/avc/cache_stats: no such file or directory openat$selinux_avc_cache_threshold : failed to open /selinux/avc/cache_threshold: no such file or directory openat$selinux_avc_hash_stats : failed to open /selinux/avc/hash_stats: no such file or directory openat$selinux_checkreqprot : failed to open /selinux/checkreqprot: no such file or directory openat$selinux_commit_pending_bools : failed to open /selinux/commit_pending_bools: no such file or directory openat$selinux_context : failed to open /selinux/context: no such file or directory openat$selinux_create : failed to open /selinux/create: no such file or directory openat$selinux_enforce : failed to open /selinux/enforce: no such file or 
directory openat$selinux_load : failed to open /selinux/load: no such file or directory openat$selinux_member : failed to open /selinux/member: no such file or directory openat$selinux_mls : failed to open /selinux/mls: no such file or directory openat$selinux_policy : failed to open /selinux/policy: no such file or directory openat$selinux_relabel : failed to open /selinux/relabel: no such file or directory openat$selinux_status : failed to open /selinux/status: no such file or directory openat$selinux_user : failed to open /selinux/user: no such file or directory openat$selinux_validatetrans : failed to open /selinux/validatetrans: no such file or directory openat$sev : failed to open /dev/sev: no such file or directory openat$sgx_provision : failed to open /dev/sgx_provision: no such file or directory openat$smack_task_current : smack is not enabled openat$smack_thread_current : smack is not enabled openat$smackfs_access : failed to open /sys/fs/smackfs/access: no such file or directory openat$smackfs_ambient : failed to open /sys/fs/smackfs/ambient: no such file or directory openat$smackfs_change_rule : failed to open /sys/fs/smackfs/change-rule: no such file or directory openat$smackfs_cipso : failed to open /sys/fs/smackfs/cipso: no such file or directory openat$smackfs_cipsonum : failed to open /sys/fs/smackfs/direct: no such file or directory openat$smackfs_ipv6host : failed to open /sys/fs/smackfs/ipv6host: no such file or directory openat$smackfs_load : failed to open /sys/fs/smackfs/load: no such file or directory openat$smackfs_logging : failed to open /sys/fs/smackfs/logging: no such file or directory openat$smackfs_netlabel : failed to open /sys/fs/smackfs/netlabel: no such file or directory openat$smackfs_onlycap : failed to open /sys/fs/smackfs/onlycap: no such file or directory openat$smackfs_ptrace : failed to open /sys/fs/smackfs/ptrace: no such file or directory openat$smackfs_relabel_self : failed to open /sys/fs/smackfs/relabel-self: no such file or directory openat$smackfs_revoke_subject : failed to open /sys/fs/smackfs/revoke-subject: no such file or directory openat$smackfs_syslog : failed to open /sys/fs/smackfs/syslog: no such file or directory openat$smackfs_unconfined : failed to open /sys/fs/smackfs/unconfined: no such file or directory openat$tlk_device : failed to open /dev/tlk_device: no such file or directory openat$trusty : failed to open /dev/trusty-ipc-dev0: no such file or directory openat$trusty_avb : failed to open /dev/trusty-ipc-dev0: no such file or directory openat$trusty_gatekeeper : failed to open /dev/trusty-ipc-dev0: no such file or directory openat$trusty_hwkey : failed to open /dev/trusty-ipc-dev0: no such file or directory openat$trusty_hwrng : failed to open /dev/trusty-ipc-dev0: no such file or directory openat$trusty_km : failed to open /dev/trusty-ipc-dev0: no such file or directory openat$trusty_km_secure : failed to open /dev/trusty-ipc-dev0: no such file or directory openat$trusty_storage : failed to open /dev/trusty-ipc-dev0: no such file or directory openat$tty : failed to open /dev/tty: no such device or address openat$uverbs0 : failed to open /dev/infiniband/uverbs0: no such file or directory openat$vfio : failed to open /dev/vfio/vfio: no such file or directory openat$vndbinder : failed to open /dev/vndbinder: no such file or directory openat$vtpm : failed to open /dev/vtpmx: no such file or directory openat$xenevtchn : failed to open /dev/xen/evtchn: no such file or directory openat$zygote : failed to open /dev/socket/zygote: 
no such file or directory socket$hf : socket$hf(0x13, 0x2, 0x0) failed: address family not supported by protocol socket$inet6_dccp : socket$inet6_dccp(0xa, 0x6, 0x0) failed: socket type not supported socket$inet_dccp : socket$inet_dccp(0x2, 0x6, 0x0) failed: socket type not supported socket$vsock_dgram : socket$vsock_dgram(0x28, 0x2, 0x0) failed: no such device transitively disabled the following syscalls (missing resource [creating syscalls]): accept$ax25 : sock_ax25 [accept$ax25 accept4$ax25 syz_init_net_socket$ax25] accept$netrom : sock_netrom [accept$netrom accept4$netrom syz_init_net_socket$netrom] accept$nfc_llcp : sock_nfc_llcp [accept$nfc_llcp accept4$nfc_llcp syz_init_net_socket$nfc_llcp] close$binfmt : fd_binfmt [openat$binfmt] close$fd_v4l2_buffer : fd_v4l2_buffer [ioctl$VIDIOC_QUERYBUF_DMABUF] close$ibv_device : fd_rdma [openat$uverbs0] mmap$DRM_I915 : fd_i915 [openat$i915] mmap$DRM_MSM : fd_msm [openat$msm] mmap$KVM_VCPU : vcpu_mmap_size [ioctl$KVM_GET_VCPU_MMAP_SIZE] mmap$bifrost : fd_bifrost [openat$bifrost openat$mali] mmap$perf : fd_perf [perf_event_open perf_event_open$cgroup] mmap$snddsp : fd_snd_dsp [syz_open_dev$sndpcmc syz_open_dev$sndpcmp] mmap$snddsp_control : fd_snd_dsp [syz_open_dev$sndpcmc syz_open_dev$sndpcmp] mmap$snddsp_status : fd_snd_dsp [syz_open_dev$sndpcmc syz_open_dev$sndpcmp] mmap$usbfs : fd_usbfs [syz_open_dev$usbfs] mmap$usbmon : fd_usbmon [syz_open_dev$usbmon] openat$binfmt : ptr_binfmt_file [syz_create_resource$binfmt] setsockopt$EBT_SO_SET_ENTRIES : uid [fstat fstat$auto geteuid ...] setsockopt$IP6T_SO_SET_REPLACE : fd_bpf_prog [bpf$BPF_PROG_GET_FD_BY_ID bpf$BPF_PROG_RAW_TRACEPOINT_LOAD bpf$BPF_PROG_WITH_BTFID_LOAD ...] setsockopt$IPT_SO_SET_REPLACE : fd_bpf_prog [bpf$BPF_PROG_GET_FD_BY_ID bpf$BPF_PROG_RAW_TRACEPOINT_LOAD bpf$BPF_PROG_WITH_BTFID_LOAD ...] setsockopt$SO_VM_SOCKETS_CONNECT_TIMEOUT_OLD: time_usec [getitimer getrusage getsockopt$sock_timeval ...] setsockopt$WPAN_SECURITY : sock_802154_dgram [syz_init_net_socket$802154_dgram] setsockopt$WPAN_SECURITY_LEVEL : sock_802154_dgram [syz_init_net_socket$802154_dgram] setsockopt$WPAN_WANTACK : sock_802154_dgram [syz_init_net_socket$802154_dgram] setsockopt$WPAN_WANTLQI : sock_802154_dgram [syz_init_net_socket$802154_dgram] setsockopt$X25_QBITINCL : sock_x25 [accept4$x25 syz_init_net_socket$x25] setsockopt$ax25_SO_BINDTODEVICE : sock_ax25 [accept$ax25 accept4$ax25 syz_init_net_socket$ax25] setsockopt$ax25_int : sock_ax25 [accept$ax25 accept4$ax25 syz_init_net_socket$ax25] setsockopt$bt_BT_CHANNEL_POLICY : sock_bt [accept4$bt_l2cap syz_init_net_socket$bt_bnep syz_init_net_socket$bt_cmtp ...] setsockopt$bt_BT_DEFER_SETUP : sock_bt [accept4$bt_l2cap syz_init_net_socket$bt_bnep syz_init_net_socket$bt_cmtp ...] setsockopt$bt_BT_FLUSHABLE : sock_bt [accept4$bt_l2cap syz_init_net_socket$bt_bnep syz_init_net_socket$bt_cmtp ...] setsockopt$bt_BT_POWER : sock_bt [accept4$bt_l2cap syz_init_net_socket$bt_bnep syz_init_net_socket$bt_cmtp ...] setsockopt$bt_BT_RCVMTU : sock_bt [accept4$bt_l2cap syz_init_net_socket$bt_bnep syz_init_net_socket$bt_cmtp ...] setsockopt$bt_BT_SECURITY : sock_bt [accept4$bt_l2cap syz_init_net_socket$bt_bnep syz_init_net_socket$bt_cmtp ...] setsockopt$bt_BT_SNDMTU : sock_bt [accept4$bt_l2cap syz_init_net_socket$bt_bnep syz_init_net_socket$bt_cmtp ...] setsockopt$bt_BT_VOICE : sock_bt [accept4$bt_l2cap syz_init_net_socket$bt_bnep syz_init_net_socket$bt_cmtp ...] 
setsockopt$bt_hci_HCI_DATA_DIR : sock_bt_hci [syz_init_net_socket$bt_hci] setsockopt$bt_hci_HCI_FILTER : sock_bt_hci [syz_init_net_socket$bt_hci] setsockopt$bt_hci_HCI_TIME_STAMP : sock_bt_hci [syz_init_net_socket$bt_hci] setsockopt$bt_l2cap_L2CAP_CONNINFO : sock_bt_l2cap [accept4$bt_l2cap syz_init_net_socket$bt_l2cap] setsockopt$bt_l2cap_L2CAP_LM : sock_bt_l2cap [accept4$bt_l2cap syz_init_net_socket$bt_l2cap] setsockopt$bt_l2cap_L2CAP_OPTIONS : sock_bt_l2cap [accept4$bt_l2cap syz_init_net_socket$bt_l2cap] setsockopt$bt_rfcomm_RFCOMM_LM : sock_bt_rfcomm [syz_init_net_socket$bt_rfcomm] setsockopt$inet6_IPV6_IPSEC_POLICY : uid [fstat fstat$auto geteuid ...] setsockopt$inet6_IPV6_XFRM_POLICY : uid [fstat fstat$auto geteuid ...] setsockopt$inet6_dccp_buf : sock_dccp6 [socket$inet6_dccp] setsockopt$inet6_dccp_int : sock_dccp6 [socket$inet6_dccp] setsockopt$inet_IP_IPSEC_POLICY : uid [fstat fstat$auto geteuid ...] setsockopt$inet_IP_XFRM_POLICY : uid [fstat fstat$auto geteuid ...] setsockopt$inet_dccp_buf : sock_dccp [socket$inet_dccp] setsockopt$inet_dccp_int : sock_dccp [socket$inet_dccp] setsockopt$inet_sctp6_SCTP_ADD_STREAMS : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_ASSOCINFO : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_AUTH_DEACTIVATE_KEY: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_AUTH_DELETE_KEY : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_AUTH_KEY : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_CONTEXT : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_DEFAULT_PRINFO : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_DEFAULT_SEND_PARAM: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_DEFAULT_SNDINFO : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_DELAYED_SACK : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_ENABLE_STREAM_RESET: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_MAXSEG : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_MAX_BURST : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] 
setsockopt$inet_sctp6_SCTP_PEER_ADDR_PARAMS : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_PEER_ADDR_THLDS : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_PRIMARY_ADDR : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_PR_SUPPORTED : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_RECONFIG_SUPPORTED: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_RESET_ASSOC : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_RESET_STREAMS : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_RTOINFO : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_SET_PEER_PRIMARY_ADDR: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_STREAM_SCHEDULER : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_STREAM_SCHEDULER_VALUE: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_ADD_STREAMS : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_ASSOCINFO : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_AUTH_ACTIVE_KEY : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_AUTH_DEACTIVATE_KEY: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_AUTH_DELETE_KEY : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_AUTH_KEY : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_CONTEXT : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_DEFAULT_PRINFO : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_DEFAULT_SEND_PARAM: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] 
setsockopt$inet_sctp_SCTP_DEFAULT_SNDINFO : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_DELAYED_SACK : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_ENABLE_STREAM_RESET: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_MAXSEG : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_MAX_BURST : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_PEER_ADDR_PARAMS : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_PEER_ADDR_THLDS : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_PRIMARY_ADDR : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_PR_SUPPORTED : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_RECONFIG_SUPPORTED: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_RESET_ASSOC : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_RESET_STREAMS : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_RTOINFO : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_SET_PEER_PRIMARY_ADDR: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_STREAM_SCHEDULER : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_STREAM_SCHEDULER_VALUE: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] 
setsockopt$llc_int : sock_llc [accept4$llc syz_init_net_socket$llc] setsockopt$netrom_NETROM_IDLE : sock_netrom [accept$netrom accept4$netrom syz_init_net_socket$netrom] setsockopt$netrom_NETROM_N2 : sock_netrom [accept$netrom accept4$netrom syz_init_net_socket$netrom] setsockopt$netrom_NETROM_T1 : sock_netrom [accept$netrom accept4$netrom syz_init_net_socket$netrom] setsockopt$netrom_NETROM_T2 : sock_netrom [accept$netrom accept4$netrom syz_init_net_socket$netrom] setsockopt$netrom_NETROM_T4 : sock_netrom [accept$netrom accept4$netrom syz_init_net_socket$netrom] setsockopt$nfc_llcp_NFC_LLCP_MIUX : sock_nfc_llcp [accept$nfc_llcp accept4$nfc_llcp syz_init_net_socket$nfc_llcp] setsockopt$nfc_llcp_NFC_LLCP_RW : sock_nfc_llcp [accept$nfc_llcp accept4$nfc_llcp syz_init_net_socket$nfc_llcp] setsockopt$rose : sock_rose [accept4$rose syz_init_net_socket$rose] setsockopt$sock_attach_bpf : fd_bpf_prog [bpf$BPF_PROG_GET_FD_BY_ID bpf$BPF_PROG_RAW_TRACEPOINT_LOAD bpf$BPF_PROG_WITH_BTFID_LOAD ...] setsockopt$sock_cred : pid [capget$auto capset$auto clone$auto ...] setsockopt$sock_timeval : time_usec [getitimer getrusage getsockopt$sock_timeval ...] syz_memcpy_off$KVM_EXIT_HYPERCALL : kvm_run_ptr [mmap$KVM_VCPU] syz_memcpy_off$KVM_EXIT_MMIO : kvm_run_ptr [mmap$KVM_VCPU] BinFmtMisc : enabled Comparisons : enabled Coverage : enabled DelayKcovMmap : enabled DevlinkPCI : PCI device 0000:00:10.0 is not available ExtraCoverage : enabled Fault : enabled KCSAN : write(/sys/kernel/debug/kcsan, on) failed KcovResetIoctl : kernel does not support ioctl(KCOV_RESET_TRACE) LRWPANEmulation : enabled Leak : failed to write(kmemleak, "scan=off") NetDevices : enabled NetInjection : enabled NicVF : PCI device 0000:00:11.0 is not available SandboxAndroid : setfilecon: setxattr failed. (errno 1: Operation not permitted). . process exited with status 67. 
SandboxNamespace : enabled SandboxNone : enabled SandboxSetuid : enabled Swap : enabled USBEmulation : enabled VhciInjection : enabled WifiEmulation : enabled syscalls : 450/8048 2025/08/15 18:51:46 base: machine check complete 2025/08/15 18:51:49 machine check: disabled the following syscalls: openat$acpi_thermal_rel : failed to open /dev/acpi_thermal_rel: no such file or directory openat$ashmem : failed to open /dev/ashmem: no such file or directory openat$bifrost : failed to open /dev/bifrost: no such file or directory openat$binder : failed to open /dev/binder: no such file or directory openat$camx : failed to open /dev/v4l/by-path/platform-soc@0:qcom_cam-req-mgr-video-index0: no such file or directory openat$capi20 : failed to open /dev/capi20: no such file or directory openat$cdrom1 : failed to open /dev/cdrom1: no such file or directory openat$damon_attrs : failed to open /sys/kernel/debug/damon/attrs: no such file or directory openat$damon_init_regions : failed to open /sys/kernel/debug/damon/init_regions: no such file or directory openat$damon_kdamond_pid : failed to open /sys/kernel/debug/damon/kdamond_pid: no such file or directory openat$damon_mk_contexts : failed to open /sys/kernel/debug/damon/mk_contexts: no such file or directory openat$damon_monitor_on : failed to open /sys/kernel/debug/damon/monitor_on: no such file or directory openat$damon_rm_contexts : failed to open /sys/kernel/debug/damon/rm_contexts: no such file or directory openat$damon_schemes : failed to open /sys/kernel/debug/damon/schemes: no such file or directory openat$damon_target_ids : failed to open /sys/kernel/debug/damon/target_ids: no such file or directory openat$hwbinder : failed to open /dev/hwbinder: no such file or directory openat$i915 : failed to open /dev/i915: no such file or directory openat$img_rogue : failed to open /dev/img-rogue: no such file or directory openat$irnet : failed to open /dev/irnet: no such file or directory openat$keychord : failed to open /dev/keychord: no such file or directory openat$kvm : failed to open /dev/kvm: no such file or directory openat$lightnvm : failed to open /dev/lightnvm/control: no such file or directory openat$mali : failed to open /dev/mali0: no such file or directory openat$md : failed to open /dev/md0: no such file or directory openat$msm : failed to open /dev/msm: no such file or directory openat$ndctl0 : failed to open /dev/ndctl0: no such file or directory openat$nmem0 : failed to open /dev/nmem0: no such file or directory openat$pktcdvd : failed to open /dev/pktcdvd/control: no such file or directory openat$pmem0 : failed to open /dev/pmem0: no such file or directory openat$proc_capi20 : failed to open /proc/capi/capi20: no such file or directory openat$proc_capi20ncci : failed to open /proc/capi/capi20ncci: no such file or directory openat$proc_reclaim : failed to open /proc/self/reclaim: no such file or directory openat$ptp1 : failed to open /dev/ptp1: no such file or directory openat$rnullb : failed to open /dev/rnullb0: no such file or directory openat$selinux_access : failed to open /selinux/access: no such file or directory openat$selinux_attr : selinux is not enabled openat$selinux_avc_cache_stats : failed to open /selinux/avc/cache_stats: no such file or directory openat$selinux_avc_cache_threshold : failed to open /selinux/avc/cache_threshold: no such file or directory openat$selinux_avc_hash_stats : failed to open /selinux/avc/hash_stats: no such file or directory openat$selinux_checkreqprot : failed to open /selinux/checkreqprot: no 
such file or directory openat$selinux_commit_pending_bools : failed to open /selinux/commit_pending_bools: no such file or directory openat$selinux_context : failed to open /selinux/context: no such file or directory openat$selinux_create : failed to open /selinux/create: no such file or directory openat$selinux_enforce : failed to open /selinux/enforce: no such file or directory openat$selinux_load : failed to open /selinux/load: no such file or directory openat$selinux_member : failed to open /selinux/member: no such file or directory openat$selinux_mls : failed to open /selinux/mls: no such file or directory openat$selinux_policy : failed to open /selinux/policy: no such file or directory openat$selinux_relabel : failed to open /selinux/relabel: no such file or directory openat$selinux_status : failed to open /selinux/status: no such file or directory openat$selinux_user : failed to open /selinux/user: no such file or directory openat$selinux_validatetrans : failed to open /selinux/validatetrans: no such file or directory openat$sev : failed to open /dev/sev: no such file or directory openat$sgx_provision : failed to open /dev/sgx_provision: no such file or directory openat$smack_task_current : smack is not enabled openat$smack_thread_current : smack is not enabled openat$smackfs_access : failed to open /sys/fs/smackfs/access: no such file or directory openat$smackfs_ambient : failed to open /sys/fs/smackfs/ambient: no such file or directory openat$smackfs_change_rule : failed to open /sys/fs/smackfs/change-rule: no such file or directory openat$smackfs_cipso : failed to open /sys/fs/smackfs/cipso: no such file or directory openat$smackfs_cipsonum : failed to open /sys/fs/smackfs/direct: no such file or directory openat$smackfs_ipv6host : failed to open /sys/fs/smackfs/ipv6host: no such file or directory openat$smackfs_load : failed to open /sys/fs/smackfs/load: no such file or directory openat$smackfs_logging : failed to open /sys/fs/smackfs/logging: no such file or directory openat$smackfs_netlabel : failed to open /sys/fs/smackfs/netlabel: no such file or directory openat$smackfs_onlycap : failed to open /sys/fs/smackfs/onlycap: no such file or directory openat$smackfs_ptrace : failed to open /sys/fs/smackfs/ptrace: no such file or directory openat$smackfs_relabel_self : failed to open /sys/fs/smackfs/relabel-self: no such file or directory openat$smackfs_revoke_subject : failed to open /sys/fs/smackfs/revoke-subject: no such file or directory openat$smackfs_syslog : failed to open /sys/fs/smackfs/syslog: no such file or directory openat$smackfs_unconfined : failed to open /sys/fs/smackfs/unconfined: no such file or directory openat$tlk_device : failed to open /dev/tlk_device: no such file or directory openat$trusty : failed to open /dev/trusty-ipc-dev0: no such file or directory openat$trusty_avb : failed to open /dev/trusty-ipc-dev0: no such file or directory openat$trusty_gatekeeper : failed to open /dev/trusty-ipc-dev0: no such file or directory openat$trusty_hwkey : failed to open /dev/trusty-ipc-dev0: no such file or directory openat$trusty_hwrng : failed to open /dev/trusty-ipc-dev0: no such file or directory openat$trusty_km : failed to open /dev/trusty-ipc-dev0: no such file or directory openat$trusty_km_secure : failed to open /dev/trusty-ipc-dev0: no such file or directory openat$trusty_storage : failed to open /dev/trusty-ipc-dev0: no such file or directory openat$tty : failed to open /dev/tty: no such device or address openat$uverbs0 : failed to open 
/dev/infiniband/uverbs0: no such file or directory openat$vfio : failed to open /dev/vfio/vfio: no such file or directory openat$vndbinder : failed to open /dev/vndbinder: no such file or directory openat$vtpm : failed to open /dev/vtpmx: no such file or directory openat$xenevtchn : failed to open /dev/xen/evtchn: no such file or directory openat$zygote : failed to open /dev/socket/zygote: no such file or directory socket$hf : socket$hf(0x13, 0x2, 0x0) failed: address family not supported by protocol socket$inet6_dccp : socket$inet6_dccp(0xa, 0x6, 0x0) failed: socket type not supported socket$inet_dccp : socket$inet_dccp(0x2, 0x6, 0x0) failed: socket type not supported socket$vsock_dgram : socket$vsock_dgram(0x28, 0x2, 0x0) failed: no such device transitively disabled the following syscalls (missing resource [creating syscalls]): accept$ax25 : sock_ax25 [accept$ax25 accept4$ax25 syz_init_net_socket$ax25] accept$netrom : sock_netrom [accept$netrom accept4$netrom syz_init_net_socket$netrom] accept$nfc_llcp : sock_nfc_llcp [accept$nfc_llcp accept4$nfc_llcp syz_init_net_socket$nfc_llcp] close$binfmt : fd_binfmt [openat$binfmt] close$fd_v4l2_buffer : fd_v4l2_buffer [ioctl$VIDIOC_QUERYBUF_DMABUF] close$ibv_device : fd_rdma [openat$uverbs0] mmap$DRM_I915 : fd_i915 [openat$i915] mmap$DRM_MSM : fd_msm [openat$msm] mmap$KVM_VCPU : vcpu_mmap_size [ioctl$KVM_GET_VCPU_MMAP_SIZE] mmap$bifrost : fd_bifrost [openat$bifrost openat$mali] mmap$perf : fd_perf [perf_event_open perf_event_open$cgroup] mmap$snddsp : fd_snd_dsp [syz_open_dev$sndpcmc syz_open_dev$sndpcmp] mmap$snddsp_control : fd_snd_dsp [syz_open_dev$sndpcmc syz_open_dev$sndpcmp] mmap$snddsp_status : fd_snd_dsp [syz_open_dev$sndpcmc syz_open_dev$sndpcmp] mmap$usbfs : fd_usbfs [syz_open_dev$usbfs] mmap$usbmon : fd_usbmon [syz_open_dev$usbmon] openat$binfmt : ptr_binfmt_file [syz_create_resource$binfmt] setsockopt$EBT_SO_SET_ENTRIES : uid [fstat fstat$auto geteuid ...] setsockopt$IP6T_SO_SET_REPLACE : fd_bpf_prog [bpf$BPF_PROG_GET_FD_BY_ID bpf$BPF_PROG_RAW_TRACEPOINT_LOAD bpf$BPF_PROG_WITH_BTFID_LOAD ...] setsockopt$IPT_SO_SET_REPLACE : fd_bpf_prog [bpf$BPF_PROG_GET_FD_BY_ID bpf$BPF_PROG_RAW_TRACEPOINT_LOAD bpf$BPF_PROG_WITH_BTFID_LOAD ...] setsockopt$SO_VM_SOCKETS_CONNECT_TIMEOUT_OLD: time_usec [getitimer getrusage getsockopt$sock_timeval ...] setsockopt$WPAN_SECURITY : sock_802154_dgram [syz_init_net_socket$802154_dgram] setsockopt$WPAN_SECURITY_LEVEL : sock_802154_dgram [syz_init_net_socket$802154_dgram] setsockopt$WPAN_WANTACK : sock_802154_dgram [syz_init_net_socket$802154_dgram] setsockopt$WPAN_WANTLQI : sock_802154_dgram [syz_init_net_socket$802154_dgram] setsockopt$X25_QBITINCL : sock_x25 [accept4$x25 syz_init_net_socket$x25] setsockopt$ax25_SO_BINDTODEVICE : sock_ax25 [accept$ax25 accept4$ax25 syz_init_net_socket$ax25] setsockopt$ax25_int : sock_ax25 [accept$ax25 accept4$ax25 syz_init_net_socket$ax25] setsockopt$bt_BT_CHANNEL_POLICY : sock_bt [accept4$bt_l2cap syz_init_net_socket$bt_bnep syz_init_net_socket$bt_cmtp ...] setsockopt$bt_BT_DEFER_SETUP : sock_bt [accept4$bt_l2cap syz_init_net_socket$bt_bnep syz_init_net_socket$bt_cmtp ...] setsockopt$bt_BT_FLUSHABLE : sock_bt [accept4$bt_l2cap syz_init_net_socket$bt_bnep syz_init_net_socket$bt_cmtp ...] setsockopt$bt_BT_POWER : sock_bt [accept4$bt_l2cap syz_init_net_socket$bt_bnep syz_init_net_socket$bt_cmtp ...] setsockopt$bt_BT_RCVMTU : sock_bt [accept4$bt_l2cap syz_init_net_socket$bt_bnep syz_init_net_socket$bt_cmtp ...] 
setsockopt$bt_BT_SECURITY : sock_bt [accept4$bt_l2cap syz_init_net_socket$bt_bnep syz_init_net_socket$bt_cmtp ...] setsockopt$bt_BT_SNDMTU : sock_bt [accept4$bt_l2cap syz_init_net_socket$bt_bnep syz_init_net_socket$bt_cmtp ...] setsockopt$bt_BT_VOICE : sock_bt [accept4$bt_l2cap syz_init_net_socket$bt_bnep syz_init_net_socket$bt_cmtp ...] setsockopt$bt_hci_HCI_DATA_DIR : sock_bt_hci [syz_init_net_socket$bt_hci] setsockopt$bt_hci_HCI_FILTER : sock_bt_hci [syz_init_net_socket$bt_hci] setsockopt$bt_hci_HCI_TIME_STAMP : sock_bt_hci [syz_init_net_socket$bt_hci] setsockopt$bt_l2cap_L2CAP_CONNINFO : sock_bt_l2cap [accept4$bt_l2cap syz_init_net_socket$bt_l2cap] setsockopt$bt_l2cap_L2CAP_LM : sock_bt_l2cap [accept4$bt_l2cap syz_init_net_socket$bt_l2cap] setsockopt$bt_l2cap_L2CAP_OPTIONS : sock_bt_l2cap [accept4$bt_l2cap syz_init_net_socket$bt_l2cap] setsockopt$bt_rfcomm_RFCOMM_LM : sock_bt_rfcomm [syz_init_net_socket$bt_rfcomm] setsockopt$inet6_IPV6_IPSEC_POLICY : uid [fstat fstat$auto geteuid ...] setsockopt$inet6_IPV6_XFRM_POLICY : uid [fstat fstat$auto geteuid ...] setsockopt$inet6_dccp_buf : sock_dccp6 [socket$inet6_dccp] setsockopt$inet6_dccp_int : sock_dccp6 [socket$inet6_dccp] setsockopt$inet_IP_IPSEC_POLICY : uid [fstat fstat$auto geteuid ...] setsockopt$inet_IP_XFRM_POLICY : uid [fstat fstat$auto geteuid ...] setsockopt$inet_dccp_buf : sock_dccp [socket$inet_dccp] setsockopt$inet_dccp_int : sock_dccp [socket$inet_dccp] setsockopt$inet_sctp6_SCTP_ADD_STREAMS : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_ASSOCINFO : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_AUTH_DEACTIVATE_KEY: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_AUTH_DELETE_KEY : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_AUTH_KEY : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_CONTEXT : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_DEFAULT_PRINFO : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_DEFAULT_SEND_PARAM: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_DEFAULT_SNDINFO : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_DELAYED_SACK : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_ENABLE_STREAM_RESET: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] 
setsockopt$inet_sctp6_SCTP_MAXSEG : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_MAX_BURST : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_PEER_ADDR_PARAMS : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_PEER_ADDR_THLDS : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_PRIMARY_ADDR : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_PR_SUPPORTED : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_RECONFIG_SUPPORTED: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_RESET_ASSOC : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_RESET_STREAMS : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_RTOINFO : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_SET_PEER_PRIMARY_ADDR: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_STREAM_SCHEDULER : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp6_SCTP_STREAM_SCHEDULER_VALUE: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_ADD_STREAMS : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_ASSOCINFO : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_AUTH_ACTIVE_KEY : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_AUTH_DEACTIVATE_KEY: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_AUTH_DELETE_KEY : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_AUTH_KEY : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_CONTEXT : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] 
setsockopt$inet_sctp_SCTP_DEFAULT_PRINFO : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_DEFAULT_SEND_PARAM: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_DEFAULT_SNDINFO : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_DELAYED_SACK : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_ENABLE_STREAM_RESET: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_MAXSEG : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_MAX_BURST : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_PEER_ADDR_PARAMS : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_PEER_ADDR_THLDS : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_PRIMARY_ADDR : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_PR_SUPPORTED : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_RECONFIG_SUPPORTED: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_RESET_ASSOC : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_RESET_STREAMS : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_RTOINFO : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_SET_PEER_PRIMARY_ADDR: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_STREAM_SCHEDULER : assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] setsockopt$inet_sctp_SCTP_STREAM_SCHEDULER_VALUE: assoc_id [getsockopt$inet_sctp6_SCTP_ASSOCINFO getsockopt$inet_sctp6_SCTP_AUTH_ACTIVE_KEY getsockopt$inet_sctp6_SCTP_CONTEXT ...] 
setsockopt$llc_int : sock_llc [accept4$llc syz_init_net_socket$llc] setsockopt$netrom_NETROM_IDLE : sock_netrom [accept$netrom accept4$netrom syz_init_net_socket$netrom] setsockopt$netrom_NETROM_N2 : sock_netrom [accept$netrom accept4$netrom syz_init_net_socket$netrom] setsockopt$netrom_NETROM_T1 : sock_netrom [accept$netrom accept4$netrom syz_init_net_socket$netrom] setsockopt$netrom_NETROM_T2 : sock_netrom [accept$netrom accept4$netrom syz_init_net_socket$netrom] setsockopt$netrom_NETROM_T4 : sock_netrom [accept$netrom accept4$netrom syz_init_net_socket$netrom] setsockopt$nfc_llcp_NFC_LLCP_MIUX : sock_nfc_llcp [accept$nfc_llcp accept4$nfc_llcp syz_init_net_socket$nfc_llcp] setsockopt$nfc_llcp_NFC_LLCP_RW : sock_nfc_llcp [accept$nfc_llcp accept4$nfc_llcp syz_init_net_socket$nfc_llcp] setsockopt$rose : sock_rose [accept4$rose syz_init_net_socket$rose] setsockopt$sock_attach_bpf : fd_bpf_prog [bpf$BPF_PROG_GET_FD_BY_ID bpf$BPF_PROG_RAW_TRACEPOINT_LOAD bpf$BPF_PROG_WITH_BTFID_LOAD ...] setsockopt$sock_cred : pid [capget$auto capset$auto clone$auto ...] setsockopt$sock_timeval : time_usec [getitimer getrusage getsockopt$sock_timeval ...] syz_memcpy_off$KVM_EXIT_HYPERCALL : kvm_run_ptr [mmap$KVM_VCPU] syz_memcpy_off$KVM_EXIT_MMIO : kvm_run_ptr [mmap$KVM_VCPU] BinFmtMisc : enabled Comparisons : enabled Coverage : enabled DelayKcovMmap : enabled DevlinkPCI : PCI device 0000:00:10.0 is not available ExtraCoverage : enabled Fault : enabled KCSAN : write(/sys/kernel/debug/kcsan, on) failed KcovResetIoctl : kernel does not support ioctl(KCOV_RESET_TRACE) LRWPANEmulation : enabled Leak : failed to write(kmemleak, "scan=off") NetDevices : enabled NetInjection : enabled NicVF : PCI device 0000:00:11.0 is not available SandboxAndroid : setfilecon: setxattr failed. (errno 1: Operation not permitted). . process exited with status 67. 
SandboxNamespace : enabled
SandboxNone : enabled
SandboxSetuid : enabled
Swap : enabled
USBEmulation : enabled
VhciInjection : enabled
WifiEmulation : enabled
syscalls : 450/8048
2025/08/15 18:51:49 new: machine check complete
2025/08/15 18:51:51 new: adding 48399 seeds
2025/08/15 18:52:29 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:52:29 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:52:40 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:52:40 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:53:21 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:53:21 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:53:22 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:53:22 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:53:26 runner 0 connected
2025/08/15 18:53:29 runner 6 connected
2025/08/15 18:53:30 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:53:30 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:53:31 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:53:31 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:53:32 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:53:32 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:53:34 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:53:34 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:53:47 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:53:47 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:53:49 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:53:49 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:54:09 runner 5 connected
2025/08/15 18:54:11 runner 4 connected
2025/08/15 18:54:19 runner 8 connected
2025/08/15 18:54:20 runner 9 connected
2025/08/15 18:54:23 runner 7 connected
2025/08/15 18:54:28 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:54:28 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:54:29 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:54:29 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:54:31 runner 6 connected
2025/08/15 18:54:35 runner 0 connected
2025/08/15 18:54:38 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:54:38 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:54:39 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:54:39 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:54:48 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:54:48 scheduled a reproduction of 'general protection fault in __io_queue_proc'
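The run above repeats the same two-line pattern: a "patched crashed: <title> [need repro = true]" entry immediately followed by "scheduled a reproduction of '<title>'". A quick way to see which crash titles dominate such a log is to tally the "patched crashed:" lines. Below is a minimal sketch in Go, assuming the log has been captured one entry per line in a file named fuzz.log; the file name and the program itself are illustrative, only the "patched crashed: " and "[need repro" markers are taken from the log.

// tally_crashes.go: count how often each crash title appears in a
// syzkaller-style diff-fuzzing log (one log entry per line).
// The path "fuzz.log" is a hypothetical capture of the output above.
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	f, err := os.Open("fuzz.log") // hypothetical path to the saved log
	if err != nil {
		panic(err)
	}
	defer f.Close()

	counts := make(map[string]int)
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // STAT entries can be long
	for sc.Scan() {
		line := sc.Text()
		// Entries look like:
		//   2025/08/15 18:52:29 patched crashed: <title> [need repro = true]
		if i := strings.Index(line, "patched crashed: "); i >= 0 {
			title := line[i+len("patched crashed: "):]
			if j := strings.Index(title, " [need repro"); j >= 0 {
				title = title[:j]
			}
			counts[title]++
		}
	}
	for title, n := range counts {
		fmt.Printf("%4d  %s\n", n, title)
	}
}

Run over this log, it would report a single title, general protection fault in __io_queue_proc, with a count matching the number of scheduled reproductions.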
2025/08/15 18:55:17 runner 5 connected
2025/08/15 18:55:20 runner 4 connected
2025/08/15 18:55:27 runner 8 connected
2025/08/15 18:55:28 runner 9 connected
2025/08/15 18:55:29 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:55:29 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:55:37 runner 7 connected
2025/08/15 18:55:39 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:55:39 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:55:45 STAT { "buffer too small": 0, "candidate triage jobs": 43, "candidates": 46321, "comps overflows": 0, "corpus": 2007, "corpus [files]": 524, "corpus [symbols]": 333, "cover overflows": 1930, "coverage": 30727, "distributor delayed": 2152, "distributor undelayed": 2111, "distributor violated": 62, "exec candidate": 2078, "exec collide": 0, "exec fuzz": 0, "exec gen": 0, "exec hints": 0, "exec inject": 0, "exec minimize": 0, "exec retries": 0, "exec seeds": 0, "exec smash": 0, "exec total [base]": 18657, "exec total [new]": 30006, "exec triage": 6349, "executor restarts": 88, "fault jobs": 0, "fuzzer jobs": 43, "fuzzing VMs [base]": 2, "fuzzing VMs [new]": 3, "hints jobs": 0, "max signal": 31230, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 0, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 0, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 2078, "no exec duration": 150858000000, "no exec requests": 2060, "pending": 17, "prog exec time": 61, "reproducing": 0, "rpc recv": 844413516, "rpc sent": 14387752, "signal": 30411, "smash jobs": 0, "triage jobs": 0, "vm output": 363831, "vm restarts [base]": 2, "vm restarts [new]": 22 }
2025/08/15 18:55:46 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:55:46 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:55:57 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:55:57 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:56:03 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:56:03 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:56:07 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:56:07 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:56:13 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:56:13 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:56:18 runner 0 connected
2025/08/15 18:56:28 runner 6 connected
2025/08/15 18:56:35 runner 4 connected
2025/08/15 18:56:38 runner 8 connected
2025/08/15 18:56:39 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:56:39 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:56:47 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:56:47 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:56:48 runner 5 connected
2025/08/15 18:56:51 runner 9 connected
2025/08/15 18:56:54 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:56:54 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:56:54 runner 7 connected
2025/08/15 18:56:57 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:56:57 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:57:07 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:57:07 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:57:10 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:57:10 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:57:20 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:57:20 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:57:21 runner 0 connected
2025/08/15 18:57:28 runner 6 connected
2025/08/15 18:57:35 runner 4 connected
2025/08/15 18:57:41 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:57:41 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:57:45 runner 8 connected
2025/08/15 18:57:51 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:57:51 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:57:56 runner 5 connected
2025/08/15 18:57:58 runner 9 connected
2025/08/15 18:58:01 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:58:01 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:58:02 runner 7 connected
2025/08/15 18:58:11 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:58:11 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:58:22 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:58:22 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:58:22 runner 0 connected
2025/08/15 18:58:32 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:58:32 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:58:40 runner 6 connected
2025/08/15 18:58:42 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:58:42 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:58:48 runner 4 connected
2025/08/15 18:59:00 runner 8 connected
2025/08/15 18:59:12 runner 5 connected
2025/08/15 18:59:20 runner 7 connected
2025/08/15 18:59:29 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:59:29 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:59:31 runner 9 connected
2025/08/15 18:59:36 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:59:36 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:59:40 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:59:40 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:59:46 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:59:46 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:59:56 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:59:56 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 18:59:56 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 18:59:56 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:00:06 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:00:06 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:00:17 runner 8 connected
2025/08/15 19:00:24 runner 0 connected
2025/08/15 19:00:28 runner 5 connected
2025/08/15 19:00:37 runner 6 connected
2025/08/15 19:00:37 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:00:37 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:00:38 runner 7 connected
2025/08/15 19:00:45 STAT { "buffer too small": 0, "candidate triage jobs": 34, "candidates": 45747, "comps overflows": 0, "corpus": 2579, "corpus [files]": 682, "corpus [symbols]": 435, "cover overflows": 2847, "coverage": 32954, "distributor delayed": 3189, "distributor undelayed": 3167, "distributor violated": 151, "exec candidate": 2652, "exec collide": 0, "exec fuzz": 0, "exec gen": 0, "exec hints": 0, "exec inject": 0, "exec minimize": 0, "exec retries": 0, "exec seeds": 0, "exec smash": 0, "exec total [base]": 34132, "exec total [new]": 45338, "exec triage": 8088, "executor restarts": 156, "fault jobs": 0, "fuzzer jobs": 34, "fuzzing VMs [base]": 2, "fuzzing VMs [new]": 3, "hints jobs": 0, "max signal": 33155, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 0, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 0, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 2652, "no exec duration": 481222000000, "no exec requests": 5217, "pending": 44, "prog exec time": 54, "reproducing": 0, "rpc recv": 1677922228, "rpc sent": 27276016, "signal": 32610, "smash jobs": 0, "triage jobs": 0, "vm output": 776736, "vm restarts [base]": 2, "vm restarts [new]": 48 }
2025/08/15 19:00:49 runner 9 connected
2025/08/15 19:00:51 new: boot error: can't ssh into the instance
2025/08/15 19:00:51 new: boot error: can't ssh into the instance
2025/08/15 19:00:51 base: boot error: can't ssh into the instance
2025/08/15 19:00:51 base: boot error: can't ssh into the instance
2025/08/15 19:00:53 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:00:53 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:01:03 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:01:03 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:01:13 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:01:13 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:01:23 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:01:23 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:01:26 runner 8 connected
2025/08/15 19:01:34 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:01:34 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:01:38 runner 3 connected
2025/08/15 19:01:40 runner 2 connected
2025/08/15 19:01:40 runner 1 connected
2025/08/15 19:01:40 runner 3 connected
2025/08/15 19:01:43 runner 0 connected
2025/08/15 19:01:52 runner 5 connected
2025/08/15 19:01:53 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:01:53 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:02:01 runner 7 connected
2025/08/15 19:02:03 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:02:03 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:02:04 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:02:04 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:02:12 runner 6 connected
2025/08/15 19:02:13 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:02:13 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:02:14 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:02:14 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:02:22 runner 9 connected
2025/08/15 19:02:24 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:02:24 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:02:34 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:02:34 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:02:42 runner 8 connected
2025/08/15 19:02:51 runner 2 connected
2025/08/15 19:02:52 runner 3 connected
2025/08/15 19:02:54 runner 0 connected
2025/08/15 19:02:56 runner 5 connected
2025/08/15 19:03:12 runner 7 connected
2025/08/15 19:03:12 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:03:12 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:03:23 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:03:23 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:03:24 runner 6 connected
2025/08/15 19:03:32 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:03:32 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:03:33 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:03:33 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:03:36 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:03:36 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:03:36 new: boot error: can't ssh into the instance
2025/08/15 19:03:42 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:03:42 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:03:42 patched crashed: general protection fault in __io_queue_proc [need repro = true]
2025/08/15 19:03:42 scheduled a reproduction of 'general protection fault in __io_queue_proc'
2025/08/15 19:03:46 patched crashed: general protection fault in __io_queue_proc [need
repro = true] 2025/08/15 19:03:46 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:04:01 runner 8 connected 2025/08/15 19:04:11 runner 5 connected 2025/08/15 19:04:17 runner 1 connected 2025/08/15 19:04:20 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:04:20 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:04:20 runner 0 connected 2025/08/15 19:04:21 runner 2 connected 2025/08/15 19:04:23 runner 7 connected 2025/08/15 19:04:23 runner 3 connected 2025/08/15 19:04:24 runner 9 connected 2025/08/15 19:04:27 runner 6 connected 2025/08/15 19:04:30 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:04:30 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:04:40 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:04:40 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:04:41 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:04:41 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:04:50 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:04:50 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:04:51 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:04:51 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:05:01 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:05:01 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:05:08 runner 8 connected 2025/08/15 19:05:20 runner 5 connected 2025/08/15 19:05:29 runner 0 connected 2025/08/15 19:05:39 runner 2 connected 2025/08/15 19:05:40 runner 7 connected 2025/08/15 19:05:42 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:05:42 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:05:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 45375, "comps overflows": 0, "corpus": 2979, "corpus [files]": 792, "corpus [symbols]": 506, "cover overflows": 4152, "coverage": 34091, "distributor delayed": 3915, "distributor undelayed": 3915, "distributor violated": 214, "exec candidate": 3024, "exec collide": 0, "exec fuzz": 0, "exec gen": 0, "exec hints": 0, "exec inject": 0, "exec minimize": 0, "exec retries": 0, "exec seeds": 0, "exec smash": 0, "exec total [base]": 57078, "exec total [new]": 68378, "exec triage": 9295, "executor restarts": 246, "fault jobs": 0, "fuzzer jobs": 0, "fuzzing VMs [base]": 4, "fuzzing VMs [new]": 4, "hints jobs": 0, "max signal": 34191, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 0, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 0, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 3024, "no exec duration": 1556580000000, "no exec requests": 14741, "pending": 72, "prog exec time": 60, "reproducing": 0, "rpc recv": 2661929376, "rpc sent": 43089472, "signal": 33740, "smash jobs": 0, "triage jobs": 0, "vm output": 1307471, "vm restarts [base]": 4, "vm restarts [new]": 78 } 2025/08/15 19:06:18 patched crashed: general protection fault in __io_queue_proc 
[need repro = true] 2025/08/15 19:06:18 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:06:18 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:06:18 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:06:20 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:06:20 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:06:23 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:06:23 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:06:29 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:06:29 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:06:29 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:06:29 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:06:39 runner 3 connected 2025/08/15 19:06:59 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:06:59 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:06:59 runner 2 connected 2025/08/15 19:07:00 runner 7 connected 2025/08/15 19:07:01 runner 8 connected 2025/08/15 19:07:04 runner 5 connected 2025/08/15 19:07:10 runner 6 connected 2025/08/15 19:07:10 runner 0 connected 2025/08/15 19:07:18 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:07:18 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:07:19 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:07:19 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:07:28 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:07:28 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:07:39 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:07:39 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:07:46 runner 3 connected 2025/08/15 19:07:50 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:07:50 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:08:01 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:08:01 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:08:07 runner 2 connected 2025/08/15 19:08:08 runner 7 connected 2025/08/15 19:08:11 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:08:11 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:08:17 runner 5 connected 2025/08/15 19:08:20 runner 0 connected 2025/08/15 19:08:37 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:08:37 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:08:38 runner 8 connected 2025/08/15 19:08:46 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:08:46 scheduled a reproduction of 'general protection fault in __io_queue_proc' 
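(Annotation, not part of the manager output.) The stream above and below is dominated by repeated "patched crashed: general protection fault in __io_queue_proc" / "scheduled a reproduction" pairs, interleaved with runner reconnects, boot errors, and periodic STAT snapshots. A minimal Go sketch for tallying crash titles and pulling a few counters out of the STAT lines from a saved copy of this log follows; the file-path argument, the chosen counters, and the regular expressions are illustrative assumptions for post-processing the text above, not part of syzkaller itself.

// tally.go: count "patched crashed: <title>" entries and print selected STAT
// counters from a saved syz-manager log. Illustrative sketch only.
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
	"regexp"
)

func main() {
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: tally <manager.log>")
		os.Exit(1)
	}
	f, err := os.Open(os.Args[1])
	if err != nil {
		panic(err)
	}
	defer f.Close()

	// Matches entries like: patched crashed: <title> [need repro = true]
	crashRe := regexp.MustCompile(`patched crashed: (.+?) \[need repro = (true|false)\]`)
	// Captures the JSON object that follows "STAT " on a snapshot line.
	statRe := regexp.MustCompile(`STAT (\{.*\})`)

	crashes := map[string]int{}
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // STAT lines can be very long
	for sc.Scan() {
		line := sc.Text()
		for _, m := range crashRe.FindAllStringSubmatch(line, -1) {
			crashes[m[1]]++
		}
		if m := statRe.FindStringSubmatch(line); m != nil {
			var stat map[string]float64 // all STAT values in this log are numeric
			if err := json.Unmarshal([]byte(m[1]), &stat); err == nil {
				fmt.Printf("corpus=%v coverage=%v pending=%v vm restarts [new]=%v\n",
					stat["corpus"], stat["coverage"], stat["pending"], stat["vm restarts [new]"])
			}
		}
	}
	for title, n := range crashes {
		fmt.Printf("%6d  %s\n", n, title)
	}
}

Assuming the log were saved as manager.log, it could be run with "go run tally.go manager.log"; the crash tally makes it easy to see that essentially all patched-kernel crashes in this window share the __io_queue_proc title, and the STAT excerpts show corpus and coverage growth alongside the climbing "pending" reproduction count.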
2025/08/15 19:08:47 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:08:47 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:08:51 runner 6 connected 2025/08/15 19:09:07 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:09:07 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:09:17 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:09:17 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:09:26 runner 7 connected 2025/08/15 19:09:35 runner 2 connected 2025/08/15 19:09:35 runner 5 connected 2025/08/15 19:09:52 new: boot error: can't ssh into the instance 2025/08/15 19:09:56 runner 8 connected 2025/08/15 19:10:06 runner 0 connected 2025/08/15 19:10:08 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:10:08 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:10:14 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:10:14 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:10:16 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:10:16 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:10:18 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:10:18 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:10:24 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:10:24 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:10:27 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:10:27 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:10:39 runner 4 connected 2025/08/15 19:10:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 45362, "comps overflows": 0, "corpus": 2983, "corpus [files]": 795, "corpus [symbols]": 508, "cover overflows": 5536, "coverage": 34117, "distributor delayed": 3936, "distributor undelayed": 3936, "distributor violated": 217, "exec candidate": 3037, "exec collide": 0, "exec fuzz": 0, "exec gen": 0, "exec hints": 0, "exec inject": 0, "exec minimize": 0, "exec retries": 0, "exec seeds": 0, "exec smash": 0, "exec total [base]": 81095, "exec total [new]": 92407, "exec triage": 9350, "executor restarts": 304, "fault jobs": 0, "fuzzer jobs": 0, "fuzzing VMs [base]": 4, "fuzzing VMs [new]": 0, "hints jobs": 0, "max signal": 34228, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 0, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 0, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 3037, "no exec duration": 2794816000000, "no exec requests": 24361, "pending": 97, "prog exec time": 0, "reproducing": 0, "rpc recv": 3322948332, "rpc sent": 57176760, "signal": 33766, "smash jobs": 0, "triage jobs": 0, "vm output": 1639173, "vm restarts [base]": 4, "vm restarts [new]": 98 } 2025/08/15 19:10:55 runner 6 connected 2025/08/15 19:10:57 runner 7 connected 2025/08/15 19:10:57 runner 2 connected 2025/08/15 19:10:58 patched crashed: general protection fault in __io_queue_proc [need 
repro = true] 2025/08/15 19:10:58 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:11:05 runner 5 connected 2025/08/15 19:11:06 runner 8 connected 2025/08/15 19:11:14 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:11:14 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:11:15 runner 0 connected 2025/08/15 19:11:36 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:11:36 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:11:43 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:11:43 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:11:45 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:11:45 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:11:48 runner 4 connected 2025/08/15 19:11:55 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:11:55 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:12:02 runner 6 connected 2025/08/15 19:12:05 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:12:05 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:12:15 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:12:15 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:12:24 runner 7 connected 2025/08/15 19:12:24 runner 2 connected 2025/08/15 19:12:26 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:12:26 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:12:26 runner 0 connected 2025/08/15 19:12:37 runner 5 connected 2025/08/15 19:13:05 runner 4 connected 2025/08/15 19:13:05 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:13:05 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:13:05 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:13:05 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:13:05 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:13:05 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:13:14 runner 6 connected 2025/08/15 19:13:15 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:13:15 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:13:46 runner 0 connected 2025/08/15 19:13:48 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:13:48 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:13:53 runner 5 connected 2025/08/15 19:13:58 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:13:58 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:14:03 runner 2 connected 2025/08/15 19:14:11 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:14:11 scheduled a reproduction of 'general protection 
fault in __io_queue_proc' 2025/08/15 19:14:36 runner 6 connected 2025/08/15 19:14:46 new: boot error: can't ssh into the instance 2025/08/15 19:14:58 runner 0 connected 2025/08/15 19:15:04 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:15:04 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:15:06 new: boot error: can't ssh into the instance 2025/08/15 19:15:15 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:15:15 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:15:25 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:15:25 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:15:27 runner 1 connected 2025/08/15 19:15:35 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:15:35 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:15:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 45341, "comps overflows": 0, "corpus": 2983, "corpus [files]": 795, "corpus [symbols]": 508, "cover overflows": 6848, "coverage": 34117, "distributor delayed": 3946, "distributor undelayed": 3946, "distributor violated": 220, "exec candidate": 3058, "exec collide": 0, "exec fuzz": 0, "exec gen": 0, "exec hints": 0, "exec inject": 0, "exec minimize": 0, "exec retries": 0, "exec seeds": 0, "exec smash": 0, "exec total [base]": 104616, "exec total [new]": 115970, "exec triage": 9385, "executor restarts": 361, "fault jobs": 0, "fuzzer jobs": 0, "fuzzing VMs [base]": 4, "fuzzing VMs [new]": 0, "hints jobs": 0, "max signal": 34243, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 0, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 0, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 3044, "no exec duration": 4376329000000, "no exec requests": 37188, "pending": 117, "prog exec time": 0, "reproducing": 0, "rpc recv": 3952209224, "rpc sent": 70910864, "signal": 33766, "smash jobs": 0, "triage jobs": 0, "vm output": 1978215, "vm restarts [base]": 4, "vm restarts [new]": 118 } 2025/08/15 19:15:46 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:15:46 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:15:47 runner 9 connected 2025/08/15 19:15:56 runner 5 connected 2025/08/15 19:16:06 runner 0 connected 2025/08/15 19:16:23 runner 6 connected 2025/08/15 19:16:28 runner 1 connected 2025/08/15 19:16:45 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:16:45 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:16:45 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:16:45 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:16:51 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:16:51 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:17:12 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:17:12 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:17:22 patched crashed: general protection fault in 
__io_queue_proc [need repro = true] 2025/08/15 19:17:22 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:17:33 runner 5 connected 2025/08/15 19:17:34 runner 0 connected 2025/08/15 19:17:51 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:17:51 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:17:53 runner 1 connected 2025/08/15 19:18:01 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:18:01 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:18:04 runner 6 connected 2025/08/15 19:18:12 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:18:12 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:18:16 new: boot error: can't ssh into the instance 2025/08/15 19:18:32 runner 5 connected 2025/08/15 19:18:43 runner 0 connected 2025/08/15 19:18:57 runner 3 connected 2025/08/15 19:19:01 runner 1 connected 2025/08/15 19:19:04 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:19:04 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:19:05 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:19:05 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:19:14 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:19:14 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:19:16 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:19:16 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:19:24 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:19:24 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:19:45 runner 6 connected 2025/08/15 19:19:56 runner 0 connected 2025/08/15 19:19:57 runner 3 connected 2025/08/15 19:20:03 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:20:03 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:20:06 runner 1 connected 2025/08/15 19:20:14 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:20:14 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:20:16 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:20:16 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:20:24 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:20:24 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:20:44 runner 6 connected 2025/08/15 19:20:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 45264, "comps overflows": 0, "corpus": 2983, "corpus [files]": 795, "corpus [symbols]": 508, "cover overflows": 7596, "coverage": 34117, "distributor delayed": 3946, "distributor undelayed": 3946, "distributor violated": 220, "exec candidate": 3135, "exec collide": 0, "exec fuzz": 0, "exec gen": 0, "exec hints": 0, "exec inject": 0, "exec minimize": 0, "exec retries": 0, "exec seeds": 0, "exec smash": 0, "exec total 
[base]": 117910, "exec total [new]": 129277, "exec triage": 9390, "executor restarts": 406, "fault jobs": 0, "fuzzer jobs": 0, "fuzzing VMs [base]": 4, "fuzzing VMs [new]": 0, "hints jobs": 0, "max signal": 34244, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 0, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 0, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 3045, "no exec duration": 5556316000000, "no exec requests": 45554, "pending": 135, "prog exec time": 0, "reproducing": 0, "rpc recv": 4483597332, "rpc sent": 80148328, "signal": 33766, "smash jobs": 0, "triage jobs": 0, "vm output": 2251537, "vm restarts [base]": 4, "vm restarts [new]": 136 } 2025/08/15 19:20:55 runner 0 connected 2025/08/15 19:20:57 runner 3 connected 2025/08/15 19:21:03 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:21:03 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:21:05 runner 1 connected 2025/08/15 19:21:15 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:21:15 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:21:34 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:21:34 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:21:44 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:21:44 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:21:52 runner 6 connected 2025/08/15 19:21:56 runner 3 connected 2025/08/15 19:22:11 new: boot error: can't ssh into the instance 2025/08/15 19:22:15 runner 1 connected 2025/08/15 19:22:26 runner 0 connected 2025/08/15 19:22:56 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:22:56 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:22:56 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:22:56 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:23:01 runner 8 connected 2025/08/15 19:23:02 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:23:02 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:23:06 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:23:06 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:23:11 new: boot error: can't ssh into the instance 2025/08/15 19:23:20 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:23:20 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:23:37 runner 0 connected 2025/08/15 19:23:37 runner 3 connected 2025/08/15 19:23:47 runner 1 connected 2025/08/15 19:23:52 runner 7 connected 2025/08/15 19:23:55 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:23:55 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:24:01 runner 8 connected 2025/08/15 19:24:04 new: boot error: can't ssh into the instance 2025/08/15 19:24:26 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:24:26 scheduled a reproduction of 
'general protection fault in __io_queue_proc' 2025/08/15 19:24:32 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:24:32 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:24:45 runner 0 connected 2025/08/15 19:24:46 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:24:46 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:24:53 runner 4 connected 2025/08/15 19:25:10 new: boot error: can't ssh into the instance 2025/08/15 19:25:16 runner 7 connected 2025/08/15 19:25:21 runner 1 connected 2025/08/15 19:25:26 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:25:26 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:25:27 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:25:27 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:25:35 runner 8 connected 2025/08/15 19:25:36 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:25:36 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:25:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 42570, "comps overflows": 0, "corpus": 2984, "corpus [files]": 795, "corpus [symbols]": 508, "cover overflows": 8872, "coverage": 34122, "distributor delayed": 3957, "distributor undelayed": 3957, "distributor violated": 222, "exec candidate": 5829, "exec collide": 0, "exec fuzz": 0, "exec gen": 0, "exec hints": 0, "exec inject": 0, "exec minimize": 0, "exec retries": 0, "exec seeds": 0, "exec smash": 0, "exec total [base]": 140253, "exec total [new]": 151635, "exec triage": 9437, "executor restarts": 458, "fault jobs": 0, "fuzzer jobs": 0, "fuzzing VMs [base]": 4, "fuzzing VMs [new]": 2, "hints jobs": 0, "max signal": 34272, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 0, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 0, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 3055, "no exec duration": 7013635000000, "no exec requests": 57712, "pending": 151, "prog exec time": 46, "reproducing": 0, "rpc recv": 5081946604, "rpc sent": 93157040, "signal": 33770, "smash jobs": 0, "triage jobs": 0, "vm output": 2556146, "vm restarts [base]": 4, "vm restarts [new]": 154 } 2025/08/15 19:25:46 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:25:46 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:25:56 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:25:56 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:26:04 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:26:04 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:26:08 runner 4 connected 2025/08/15 19:26:17 runner 0 connected 2025/08/15 19:26:27 runner 7 connected 2025/08/15 19:26:28 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:26:28 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:26:38 runner 8 connected 2025/08/15 19:26:46 runner 1 connected 2025/08/15 19:26:51 new: boot 
error: can't ssh into the instance 2025/08/15 19:27:16 runner 4 connected 2025/08/15 19:27:41 runner 9 connected 2025/08/15 19:29:10 new: boot error: can't ssh into the instance 2025/08/15 19:29:15 triaged 96.3% of the corpus 2025/08/15 19:29:15 starting bug reproductions 2025/08/15 19:29:15 starting bug reproductions (max 10 VMs, 7 repros) 2025/08/15 19:29:15 start reproducing 'general protection fault in __io_queue_proc' 2025/08/15 19:29:45 triaged 100.0% of the corpus 2025/08/15 19:30:04 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:30:04 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:30:07 runner 5 connected 2025/08/15 19:30:14 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:30:14 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:30:41 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 19:30:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 24, "corpus": 3106, "corpus [files]": 836, "corpus [symbols]": 545, "cover overflows": 11992, "coverage": 35128, "distributor delayed": 4163, "distributor undelayed": 4160, "distributor violated": 235, "exec candidate": 48399, "exec collide": 313, "exec fuzz": 539, "exec gen": 25, "exec hints": 138, "exec inject": 0, "exec minimize": 2028, "exec retries": 0, "exec seeds": 190, "exec smash": 365, "exec total [base]": 184088, "exec total [new]": 198242, "exec triage": 9861, "executor restarts": 482, "fault jobs": 0, "fuzzer jobs": 241, "fuzzing VMs [base]": 4, "fuzzing VMs [new]": 3, "hints jobs": 100, "max signal": 35592, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 1253, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 0, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 3226, "no exec duration": 7356548000000, "no exec requests": 60713, "pending": 156, "prog exec time": 338, "reproducing": 1, "rpc recv": 5399617244, "rpc sent": 115164496, "signal": 34755, "smash jobs": 121, "triage jobs": 20, "vm output": 2760238, "vm restarts [base]": 4, "vm restarts [new]": 162 } 2025/08/15 19:31:00 runner 9 connected 2025/08/15 19:31:04 runner 7 connected 2025/08/15 19:32:43 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:32:43 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:33:08 new: boot error: can't ssh into the instance 2025/08/15 19:33:39 runner 5 connected 2025/08/15 19:34:06 runner 6 connected 2025/08/15 19:34:25 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 19:35:14 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 19:35:16 new: boot error: can't ssh into the instance 2025/08/15 19:35:32 
new: boot error: can't ssh into the instance 2025/08/15 19:35:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 380, "corpus": 3485, "corpus [files]": 945, "corpus [symbols]": 643, "cover overflows": 17081, "coverage": 36764, "distributor delayed": 4554, "distributor undelayed": 4554, "distributor violated": 240, "exec candidate": 48399, "exec collide": 2187, "exec fuzz": 4210, "exec gen": 213, "exec hints": 1289, "exec inject": 0, "exec minimize": 9506, "exec retries": 0, "exec seeds": 1314, "exec smash": 3817, "exec total [base]": 201406, "exec total [new]": 218310, "exec triage": 10983, "executor restarts": 496, "fault jobs": 0, "fuzzer jobs": 773, "fuzzing VMs [base]": 4, "fuzzing VMs [new]": 5, "hints jobs": 254, "max signal": 37587, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 5876, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 0, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 3701, "no exec duration": 7454904000000, "no exec requests": 61236, "pending": 157, "prog exec time": 228, "reproducing": 1, "rpc recv": 5821497344, "rpc sent": 147298864, "signal": 36118, "smash jobs": 505, "triage jobs": 14, "vm output": 3424708, "vm restarts [base]": 4, "vm restarts [new]": 166 } 2025/08/15 19:35:46 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 19:35:46 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 19:35:50 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 19:35:50 repro finished 'general protection fault in __io_queue_proc', repro=true crepro=false desc='general protection fault in __io_queue_proc' hub=false from_dashboard=false 2025/08/15 19:35:50 found repro for "general protection fault in __io_queue_proc" (orig title: "-SAME-", reliability: 1), took 6.57 minutes 2025/08/15 19:35:50 "general protection fault in __io_queue_proc": saved crash log into 1755286550.crash.log 2025/08/15 19:35:50 "general protection fault in __io_queue_proc": saved repro log into 1755286550.repro.log 2025/08/15 19:35:50 start reproducing 'general protection fault in __io_queue_proc' 2025/08/15 19:35:57 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:36:07 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:36:12 runner 2 connected 2025/08/15 19:36:35 runner 5 connected 2025/08/15 19:36:40 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 19:36:46 runner 4 connected 2025/08/15 19:36:52 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:37:04 runner 6 connected 2025/08/15 19:37:37 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:37:42 attempt #0 to run "general protection fault in __io_queue_proc" on base: did not crash 2025/08/15 19:37:48 runner 7 connected 2025/08/15 19:37:55 patched crashed: WARNING in io_ring_exit_work [need repro = true] 2025/08/15 
19:37:55 scheduled a reproduction of 'WARNING in io_ring_exit_work' 2025/08/15 19:37:55 start reproducing 'WARNING in io_ring_exit_work' 2025/08/15 19:38:26 runner 6 connected 2025/08/15 19:38:53 runner 8 connected 2025/08/15 19:39:12 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:39:43 attempt #1 to run "general protection fault in __io_queue_proc" on base: did not crash 2025/08/15 19:40:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 610, "corpus": 3744, "corpus [files]": 1034, "corpus [symbols]": 723, "cover overflows": 21382, "coverage": 38086, "distributor delayed": 4817, "distributor undelayed": 4817, "distributor violated": 241, "exec candidate": 48399, "exec collide": 4630, "exec fuzz": 8691, "exec gen": 448, "exec hints": 2688, "exec inject": 0, "exec minimize": 13838, "exec retries": 0, "exec seeds": 2110, "exec smash": 8785, "exec total [base]": 215049, "exec total [new]": 237683, "exec triage": 11702, "executor restarts": 519, "fault jobs": 0, "fuzzer jobs": 1030, "fuzzing VMs [base]": 3, "fuzzing VMs [new]": 5, "hints jobs": 291, "max signal": 39042, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 8496, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 0, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 4011, "no exec duration": 7454904000000, "no exec requests": 61236, "pending": 157, "prog exec time": 199, "reproducing": 2, "rpc recv": 6232046152, "rpc sent": 177826032, "signal": 37429, "smash jobs": 730, "triage jobs": 9, "vm output": 3949143, "vm restarts [base]": 4, "vm restarts [new]": 173 } 2025/08/15 19:41:36 attempt #2 to run "general protection fault in __io_queue_proc" on base: did not crash 2025/08/15 19:41:36 patched-only: general protection fault in __io_queue_proc 2025/08/15 19:41:36 scheduled a reproduction of 'general protection fault in __io_queue_proc (full)' 2025/08/15 19:41:36 start reproducing 'general protection fault in __io_queue_proc (full)' 2025/08/15 19:41:46 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:42:37 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:42:41 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:42:44 runner 9 connected 2025/08/15 19:42:57 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 19:43:34 runner 8 connected 2025/08/15 19:43:37 runner 6 connected 2025/08/15 19:43:50 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 19:44:46 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:44:54 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:45:04 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:45:35 runner 6 connected 2025/08/15 19:45:37 new: boot error: can't ssh into the instance 2025/08/15 19:45:43 runner 7 connected 2025/08/15 19:45:45 
reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 19:45:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 791, "corpus": 3893, "corpus [files]": 1087, "corpus [symbols]": 768, "cover overflows": 24898, "coverage": 38659, "distributor delayed": 5023, "distributor undelayed": 5019, "distributor violated": 250, "exec candidate": 48399, "exec collide": 6591, "exec fuzz": 12454, "exec gen": 645, "exec hints": 3730, "exec inject": 0, "exec minimize": 16762, "exec retries": 0, "exec seeds": 2633, "exec smash": 13139, "exec total [base]": 229145, "exec total [new]": 252891, "exec triage": 12146, "executor restarts": 531, "fault jobs": 0, "fuzzer jobs": 1040, "fuzzing VMs [base]": 3, "fuzzing VMs [new]": 3, "hints jobs": 256, "max signal": 39724, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 10127, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 0, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 4202, "no exec duration": 7454904000000, "no exec requests": 61236, "pending": 157, "prog exec time": 245, "reproducing": 3, "rpc recv": 6503071532, "rpc sent": 206292608, "signal": 37943, "smash jobs": 771, "triage jobs": 13, "vm output": 4427828, "vm restarts [base]": 4, "vm restarts [new]": 178 } 2025/08/15 19:46:01 runner 4 connected 2025/08/15 19:47:10 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 19:47:10 repro finished 'general protection fault in __io_queue_proc', repro=true crepro=false desc='general protection fault in __io_queue_proc' hub=false from_dashboard=false 2025/08/15 19:47:10 found repro for "general protection fault in __io_queue_proc" (orig title: "-SAME-", reliability: 1), took 11.34 minutes 2025/08/15 19:47:10 start reproducing 'general protection fault in __io_queue_proc' 2025/08/15 19:47:10 "general protection fault in __io_queue_proc": saved crash log into 1755287230.crash.log 2025/08/15 19:47:10 "general protection fault in __io_queue_proc": saved repro log into 1755287230.repro.log 2025/08/15 19:47:34 new: boot error: can't ssh into the instance 2025/08/15 19:48:44 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:49:18 new: boot error: can't ssh into the instance 2025/08/15 19:49:33 runner 9 connected 2025/08/15 19:50:00 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:50:17 runner 5 connected 2025/08/15 19:50:19 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:50:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 903, "corpus": 4051, "corpus [files]": 1134, "corpus [symbols]": 810, "cover overflows": 29073, "coverage": 39352, "distributor delayed": 5197, "distributor undelayed": 5195, "distributor violated": 250, "exec candidate": 48399, "exec collide": 8950, "exec fuzz": 17043, "exec gen": 883, "exec hints": 5018, "exec inject": 0, "exec minimize": 19447, "exec retries": 0, "exec seeds": 3126, "exec 
smash": 18550, "exec total [base]": 241462, "exec total [new]": 270433, "exec triage": 12620, "executor restarts": 547, "fault jobs": 0, "fuzzer jobs": 942, "fuzzing VMs [base]": 3, "fuzzing VMs [new]": 4, "hints jobs": 230, "max signal": 40702, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 11711, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 1, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 4399, "no exec duration": 7454904000000, "no exec requests": 61236, "pending": 156, "prog exec time": 230, "reproducing": 3, "rpc recv": 6744877852, "rpc sent": 236256184, "signal": 38512, "smash jobs": 705, "triage jobs": 7, "vm output": 5051653, "vm restarts [base]": 4, "vm restarts [new]": 181 } 2025/08/15 19:51:15 runner 8 connected 2025/08/15 19:51:42 base: boot error: can't ssh into the instance 2025/08/15 19:52:09 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:52:31 runner 2 connected 2025/08/15 19:53:06 runner 5 connected 2025/08/15 19:53:13 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:53:41 attempt #0 to run "general protection fault in __io_queue_proc" on base: did not crash 2025/08/15 19:54:01 runner 6 connected 2025/08/15 19:55:09 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:55:19 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:55:30 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:55:40 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:55:41 attempt #1 to run "general protection fault in __io_queue_proc" on base: did not crash 2025/08/15 19:55:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1021, "corpus": 4180, "corpus [files]": 1161, "corpus [symbols]": 834, "cover overflows": 32569, "coverage": 40359, "distributor delayed": 5343, "distributor undelayed": 5338, "distributor violated": 255, "exec candidate": 48399, "exec collide": 11192, "exec fuzz": 21221, "exec gen": 1109, "exec hints": 6235, "exec inject": 0, "exec minimize": 21808, "exec retries": 0, "exec seeds": 3552, "exec smash": 23548, "exec total [base]": 252439, "exec total [new]": 286434, "exec triage": 12979, "executor restarts": 556, "fault jobs": 0, "fuzzer jobs": 793, "fuzzing VMs [base]": 3, "fuzzing VMs [new]": 1, "hints jobs": 196, "max signal": 41585, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 13057, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 5, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 4558, "no exec duration": 7454904000000, "no exec requests": 61236, "pending": 156, "prog exec time": 255, "reproducing": 3, "rpc recv": 6968286992, "rpc sent": 263570976, "signal": 39413, "smash jobs": 591, "triage jobs": 6, "vm output": 5945751, "vm restarts [base]": 5, "vm restarts [new]": 184 } 2025/08/15 19:55:47 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 19:56:05 runner 8 connected 2025/08/15 19:56:09 runner 7 connected 2025/08/15 19:56:19 runner 5 connected 2025/08/15 19:56:26 
reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 19:56:29 runner 6 connected 2025/08/15 19:57:35 attempt #2 to run "general protection fault in __io_queue_proc" on base: did not crash 2025/08/15 19:57:35 patched-only: general protection fault in __io_queue_proc 2025/08/15 19:57:35 scheduled a reproduction of 'general protection fault in __io_queue_proc (full)' 2025/08/15 19:57:40 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 19:58:20 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:58:30 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:58:32 runner 0 connected 2025/08/15 19:58:51 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 19:59:09 runner 6 connected 2025/08/15 19:59:28 runner 8 connected 2025/08/15 19:59:38 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 19:59:45 fuzzer has reached the modified code (855 + 1183 + 0), continuing fuzzing 2025/08/15 20:00:02 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:00:06 new: boot error: can't ssh into the instance 2025/08/15 20:00:35 runner 7 connected 2025/08/15 20:00:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1094, "corpus": 4330, "corpus [files]": 1193, "corpus [symbols]": 863, "cover overflows": 35913, "coverage": 41195, "distributor delayed": 5558, "distributor undelayed": 5558, "distributor violated": 262, "exec candidate": 48399, "exec collide": 13062, "exec fuzz": 24836, "exec gen": 1307, "exec hints": 7321, "exec inject": 0, "exec minimize": 24455, "exec retries": 0, "exec seeds": 3963, "exec smash": 27730, "exec total [base]": 265317, "exec total [new]": 300915, "exec triage": 13442, "executor restarts": 575, "fault jobs": 0, "fuzzer jobs": 706, "fuzzing VMs [base]": 4, "fuzzing VMs [new]": 5, "hints jobs": 202, "max signal": 42613, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 14486, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 5, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 4756, "no exec duration": 7454904000000, "no exec requests": 61236, "pending": 157, "prog exec time": 290, "reproducing": 3, "rpc recv": 7312842236, "rpc sent": 292234896, "signal": 40120, "smash jobs": 491, "triage jobs": 13, "vm output": 6823530, "vm restarts [base]": 6, "vm restarts [new]": 191 } 2025/08/15 20:01:03 runner 4 connected 2025/08/15 20:01:14 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: 
fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:01:49 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:02:00 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:02:01 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:02:09 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:02:29 runner 3 connected 2025/08/15 20:02:37 runner 1 connected 2025/08/15 20:02:38 runner 7 connected 2025/08/15 20:02:43 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:02:48 runner 8 connected 2025/08/15 20:02:58 runner 5 connected 2025/08/15 20:03:21 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:03:27 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:03:38 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:03:45 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:03:56 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:04:17 runner 6 connected 2025/08/15 20:04:27 runner 7 connected 2025/08/15 20:04:34 runner 5 connected 2025/08/15 20:04:44 runner 4 connected 2025/08/15 20:05:06 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:05:17 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:05:27 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:05:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1177, "corpus": 4462, "corpus [files]": 1214, "corpus [symbols]": 882, "cover overflows": 38380, "coverage": 42340, "distributor delayed": 5722, "distributor undelayed": 5704, "distributor violated": 269, "exec candidate": 48399, "exec collide": 14572, "exec fuzz": 27611, "exec gen": 1439, "exec hints": 8152, "exec inject": 0, "exec minimize": 26603, "exec retries": 0, "exec seeds": 4390, "exec smash": 30894, "exec total [base]": 279005, "exec total [new]": 312203, "exec triage": 13748, "executor restarts": 603, "fault jobs": 0, "fuzzer jobs": 665, "fuzzing VMs [base]": 4, "fuzzing VMs [new]": 1, "hints jobs": 175, "max signal": 43822, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 15567, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 5, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 4910, "no exec duration": 7867205000000, "no exec requests": 62850, "pending": 157, "prog exec time": 166, "reproducing": 3, "rpc recv": 7713762696, "rpc sent": 317825088, "signal": 41232, "smash jobs": 472, "triage jobs": 18, "vm output": 7433450, "vm restarts [base]": 8, "vm restarts [new]": 199 } 2025/08/15 20:05:55 runner 4 connected 2025/08/15 20:06:05 runner 5 connected 2025/08/15 20:06:09 runner 6 connected 2025/08/15 20:06:18 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:06:32 new: boot error: can't ssh into the instance 2025/08/15 20:06:32 patched crashed: general protection 
fault in __io_queue_proc [need repro = false] 2025/08/15 20:07:06 runner 4 connected 2025/08/15 20:07:20 runner 5 connected 2025/08/15 20:07:28 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:07:53 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:08:16 runner 7 connected 2025/08/15 20:08:44 runner 5 connected 2025/08/15 20:09:45 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:10:11 base crash: WARNING in io_ring_exit_work 2025/08/15 20:10:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1222, "corpus": 4567, "corpus [files]": 1241, "corpus [symbols]": 908, "cover overflows": 41224, "coverage": 42690, "distributor delayed": 5878, "distributor undelayed": 5878, "distributor violated": 296, "exec candidate": 48399, "exec collide": 16358, "exec fuzz": 31077, "exec gen": 1624, "exec hints": 9007, "exec inject": 0, "exec minimize": 28691, "exec retries": 0, "exec seeds": 4752, "exec smash": 35109, "exec total [base]": 291632, "exec total [new]": 325460, "exec triage": 14041, "executor restarts": 624, "fault jobs": 0, "fuzzer jobs": 508, "fuzzing VMs [base]": 3, "fuzzing VMs [new]": 3, "hints jobs": 103, "max signal": 44188, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 16675, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 5, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5023, "no exec duration": 8952578000000, "no exec requests": 67954, "pending": 157, "prog exec time": 237, "reproducing": 3, "rpc recv": 8011716812, "rpc sent": 344629488, "signal": 41554, "smash jobs": 402, "triage jobs": 3, "vm output": 8059610, "vm restarts [base]": 8, "vm restarts [new]": 206 } 2025/08/15 20:10:55 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:11:00 runner 3 connected 2025/08/15 20:11:19 new: boot error: can't ssh into the instance 2025/08/15 20:11:44 runner 4 connected 2025/08/15 20:12:07 new: boot error: can't ssh into the instance 2025/08/15 20:12:10 new: boot error: can't ssh into the instance 2025/08/15 20:12:11 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:13:00 runner 7 connected 2025/08/15 20:13:27 new: boot error: can't ssh into the instance 2025/08/15 20:13:38 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:13:38 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:13:49 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:14:17 runner 8 connected 2025/08/15 20:14:28 runner 5 connected 2025/08/15 20:14:39 runner 6 connected 2025/08/15 20:14:55 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:15:45 STAT { "buffer too small": 0, "candidate triage jobs": 
0, "candidates": 0, "comps overflows": 1282, "corpus": 4684, "corpus [files]": 1265, "corpus [symbols]": 931, "cover overflows": 44087, "coverage": 43092, "distributor delayed": 6025, "distributor undelayed": 6025, "distributor violated": 299, "exec candidate": 48399, "exec collide": 18364, "exec fuzz": 34863, "exec gen": 1807, "exec hints": 9938, "exec inject": 0, "exec minimize": 31024, "exec retries": 0, "exec seeds": 5122, "exec smash": 39789, "exec total [base]": 306169, "exec total [new]": 340058, "exec triage": 14347, "executor restarts": 639, "fault jobs": 0, "fuzzer jobs": 327, "fuzzing VMs [base]": 4, "fuzzing VMs [new]": 5, "hints jobs": 72, "max signal": 44631, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 17937, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 5, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5165, "no exec duration": 9566630000000, "no exec requests": 71255, "pending": 157, "prog exec time": 232, "reproducing": 3, "rpc recv": 8282265488, "rpc sent": 374622568, "signal": 41892, "smash jobs": 244, "triage jobs": 11, "vm output": 8611033, "vm restarts [base]": 9, "vm restarts [new]": 211 } 2025/08/15 20:16:21 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:16:21 repro finished 'general protection fault in __io_queue_proc (full)', repro=true crepro=true desc='general protection fault in __io_queue_proc' hub=false from_dashboard=false 2025/08/15 20:16:21 start reproducing 'general protection fault in __io_queue_proc (full)' 2025/08/15 20:16:21 found repro for "general protection fault in __io_queue_proc" (orig title: "-SAME-", reliability: 1), took 34.74 minutes 2025/08/15 20:16:21 "general protection fault in __io_queue_proc": saved crash log into 1755288981.crash.log 2025/08/15 20:16:21 "general protection fault in __io_queue_proc": saved repro log into 1755288981.repro.log 2025/08/15 20:17:22 repro finished 'general protection fault in __io_queue_proc', repro=false crepro=false desc='' hub=false from_dashboard=false 2025/08/15 20:17:22 failed repro for "general protection fault in __io_queue_proc", err=%!s() 2025/08/15 20:17:22 reproduction of "general protection fault in __io_queue_proc" aborted: it's no longer needed 2025/08/15 20:17:22 "general protection fault in __io_queue_proc": saved crash log into 1755289042.crash.log 2025/08/15 20:17:22 "general protection fault in __io_queue_proc": saved repro log into 1755289042.repro.log 2025/08/15 20:17:22 reproduction of "general protection fault in __io_queue_proc" aborted: it's no longer needed 2025/08/15 20:17:22 reproduction of "general protection fault in __io_queue_proc" aborted: it's no longer needed 2025/08/15 20:17:22 reproduction of "general protection fault in __io_queue_proc" aborted: it's no longer needed 2025/08/15 20:17:22 reproduction of "general protection fault in __io_queue_proc" aborted: it's no longer needed 2025/08/15 20:17:22 reproduction of "general protection fault in __io_queue_proc" aborted: it's no longer needed 2025/08/15 20:17:22 reproduction of "general protection fault in __io_queue_proc" aborted: it's no longer needed 2025/08/15 20:17:22 reproduction of "general protection fault in __io_queue_proc" aborted: it's no longer needed 2025/08/15 20:17:22 reproduction of "general 
protection fault in __io_queue_proc" aborted: it's no longer needed 2025/08/15 20:17:38 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:17:42 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:17:48 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:17:50 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:17:59 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:18:31 runner 0 connected 2025/08/15 20:18:36 runner 5 connected 2025/08/15 20:18:38 runner 7 connected 2025/08/15 20:18:48 runner 8 connected 2025/08/15 20:19:02 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:19:13 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:19:24 patched crashed: general
protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:19:51 new: boot error: can't ssh into the instance 2025/08/15 20:19:51 runner 5 connected 2025/08/15 20:20:02 runner 4 connected 2025/08/15 20:20:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1327, "corpus": 4782, "corpus [files]": 1292, "corpus [symbols]": 956, "cover overflows": 46739, "coverage": 43345, "distributor delayed": 6134, "distributor undelayed": 6134, "distributor violated": 303, "exec candidate": 48399, "exec collide": 20334, "exec fuzz": 38576, "exec gen": 2016, "exec hints": 10894, "exec inject": 0, "exec minimize": 33199, "exec retries": 0, "exec seeds": 5450, "exec smash": 44389, "exec total [base]": 320580, "exec total [new]": 354255, "exec triage": 14591, "executor restarts": 657, "fault jobs": 0, "fuzzer jobs": 93, "fuzzing VMs [base]": 3, "fuzzing VMs [new]": 3, "hints jobs": 19, "max signal": 44891, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 19084, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 5, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5274, "no exec duration": 9566630000000, "no exec requests": 71255, "pending": 0, "prog exec time": 226, "reproducing": 2, "rpc recv": 8546724916, "rpc sent": 407644752, "signal": 42128, "smash jobs": 70, "triage jobs": 4, "vm output": 9121252, "vm restarts [base]": 9, "vm restarts [new]": 217 } 2025/08/15 20:20:55 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:21:51 runner 5 connected 2025/08/15 20:22:13 new: boot error: can't ssh into the instance 2025/08/15 20:22:33 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:23:11 runner 9 connected 2025/08/15 20:23:30 runner 0 connected 2025/08/15 20:23:50 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:24:19 patched crashed: WARNING in io_ring_exit_work [need repro = false] 2025/08/15 20:24:45 base crash: WARNING in io_ring_exit_work 2025/08/15 20:24:46 runner 9 connected 2025/08/15 20:25:17 runner 8 connected 2025/08/15 20:25:20 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:25:35 runner 1 connected 2025/08/15 20:25:43 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:25:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1395, "corpus": 4840, "corpus [files]": 1308, "corpus [symbols]": 972, "cover overflows": 49613, "coverage": 43502, "distributor delayed": 6209, "distributor undelayed": 6208, "distributor violated": 304, "exec candidate": 48399, "exec collide": 23073, "exec fuzz": 43740, "exec gen": 2300, "exec hints": 12314, "exec inject": 0, "exec minimize": 34439, "exec retries": 0, "exec seeds": 5627, "exec smash": 46443, "exec total [base]": 332108, "exec total [new]": 367483, "exec triage": 14742, "executor restarts": 673, "fault jobs": 0, "fuzzer jobs": 8, "fuzzing VMs [base]": 3, "fuzzing VMs [new]": 2, "hints jobs": 2, "max signal": 45079, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 19766, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 9, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5342, "no exec duration": 9566630000000, "no exec requests": 71255, "pending": 0, "prog exec 
time": 239, "reproducing": 2, "rpc recv": 8788469548, "rpc sent": 435816648, "signal": 42258, "smash jobs": 1, "triage jobs": 5, "vm output": 9702480, "vm restarts [base]": 10, "vm restarts [new]": 222 } 2025/08/15 20:25:53 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:26:04 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:26:09 runner 9 connected 2025/08/15 20:26:27 base: boot error: can't ssh into the instance 2025/08/15 20:26:34 runner 0 connected 2025/08/15 20:26:42 runner 5 connected 2025/08/15 20:27:08 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:27:44 new: boot error: can't ssh into the instance 2025/08/15 20:28:11 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:28:32 runner 6 connected 2025/08/15 20:29:00 base crash: WARNING in io_ring_exit_work 2025/08/15 20:29:29 new: boot error: can't ssh into the instance 2025/08/15 20:29:58 runner 3 connected 2025/08/15 20:30:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1441, "corpus": 4879, "corpus [files]": 1327, "corpus [symbols]": 990, "cover overflows": 52039, "coverage": 43653, "distributor delayed": 6264, "distributor undelayed": 6264, "distributor violated": 304, "exec candidate": 48399, "exec collide": 26182, "exec fuzz": 49787, "exec gen": 2601, "exec hints": 12549, "exec inject": 0, "exec minimize": 35390, "exec retries": 0, "exec seeds": 5741, "exec smash": 47342, "exec total [base]": 342295, "exec total [new]": 379274, "exec triage": 14873, "executor restarts": 687, "fault jobs": 0, "fuzzer jobs": 13, "fuzzing VMs [base]": 3, "fuzzing VMs [new]": 4, "hints jobs": 3, "max signal": 45265, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 20291, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 10, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5397, "no exec duration": 9566630000000, "no exec requests": 71255, "pending": 0, "prog exec time": 247, "reproducing": 2, "rpc recv": 8992021592, "rpc sent": 464275912, "signal": 42414, "smash jobs": 6, "triage jobs": 4, "vm output": 10192306, "vm restarts [base]": 11, "vm restarts [new]": 226 } 2025/08/15 20:31:14 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:31:24 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:32:14 runner 5 connected 2025/08/15 20:32:18 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:33:15 runner 4 connected 2025/08/15 20:33:33 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:34:30 runner 5 connected 2025/08/15 20:35:22 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:35:44 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:35:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1460, 
"corpus": 4909, "corpus [files]": 1344, "corpus [symbols]": 1007, "cover overflows": 54434, "coverage": 43796, "distributor delayed": 6316, "distributor undelayed": 6313, "distributor violated": 307, "exec candidate": 48399, "exec collide": 29354, "exec fuzz": 55754, "exec gen": 2901, "exec hints": 12598, "exec inject": 0, "exec minimize": 36023, "exec retries": 0, "exec seeds": 5834, "exec smash": 48166, "exec total [base]": 353347, "exec total [new]": 390402, "exec triage": 14969, "executor restarts": 696, "fault jobs": 0, "fuzzer jobs": 6, "fuzzing VMs [base]": 3, "fuzzing VMs [new]": 2, "hints jobs": 0, "max signal": 45471, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 20631, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 10, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5439, "no exec duration": 9566630000000, "no exec requests": 71255, "pending": 0, "prog exec time": 236, "reproducing": 2, "rpc recv": 9120962096, "rpc sent": 493521176, "signal": 42557, "smash jobs": 0, "triage jobs": 6, "vm output": 10640397, "vm restarts [base]": 11, "vm restarts [new]": 229 } 2025/08/15 20:36:10 new: boot error: can't ssh into the instance 2025/08/15 20:36:13 runner 0 connected 2025/08/15 20:36:32 base: boot error: can't ssh into the instance 2025/08/15 20:36:40 runner 6 connected 2025/08/15 20:36:47 new: boot error: can't ssh into the instance 2025/08/15 20:37:08 runner 8 connected 2025/08/15 20:37:14 new: boot error: can't ssh into the instance 2025/08/15 20:37:15 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:37:54 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:37:58 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:38:05 runner 5 connected 2025/08/15 20:38:16 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:38:17 new: boot error: can't ssh into the instance 2025/08/15 20:38:30 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:38:33 attempt #0 to run "general protection fault in __io_queue_proc" on base: did not crash 2025/08/15 20:38:43 runner 0 connected 2025/08/15 20:39:09 runner 9 connected 2025/08/15 20:39:18 runner 6 connected 2025/08/15 20:39:20 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:39:35 new: boot error: can't ssh into the instance 2025/08/15 20:40:17 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:40:32 runner 7 connected 2025/08/15 20:40:32 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:40:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1475, "corpus": 4938, "corpus [files]": 1356, "corpus [symbols]": 1019, "cover overflows": 56983, "coverage": 43877, "distributor 
delayed": 6358, "distributor undelayed": 6358, "distributor violated": 310, "exec candidate": 48399, "exec collide": 32516, "exec fuzz": 61804, "exec gen": 3213, "exec hints": 12698, "exec inject": 0, "exec minimize": 36878, "exec retries": 0, "exec seeds": 5921, "exec smash": 48856, "exec total [base]": 363483, "exec total [new]": 401764, "exec triage": 15066, "executor restarts": 720, "fault jobs": 0, "fuzzer jobs": 7, "fuzzing VMs [base]": 3, "fuzzing VMs [new]": 5, "hints jobs": 0, "max signal": 45562, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 21121, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 10, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5478, "no exec duration": 9566630000000, "no exec requests": 71255, "pending": 0, "prog exec time": 232, "reproducing": 2, "rpc recv": 9403837804, "rpc sent": 523574072, "signal": 42634, "smash jobs": 2, "triage jobs": 5, "vm output": 11212979, "vm restarts [base]": 11, "vm restarts [new]": 237 } 2025/08/15 20:41:05 reproducing crash 'WARNING in io_ring_exit_work': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f kernel/time/sleep_timeout.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:41:11 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:41:14 runner 5 connected 2025/08/15 20:41:19 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:41:30 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:41:41 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:41:50 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:41:59 runner 6 connected 2025/08/15 20:42:07 runner 9 connected 2025/08/15 20:42:21 runner 0 connected 2025/08/15 20:42:46 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:43:20 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:43:50 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:43:58 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:44:14 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:44:39 runner 4 connected 2025/08/15 20:45:09 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:45:11 runner 6 connected 2025/08/15 20:45:14 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:45:19 patched crashed: general protection fault in __io_queue_proc [need 
repro = false] 2025/08/15 20:45:30 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:45:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1511, "corpus": 4961, "corpus [files]": 1369, "corpus [symbols]": 1032, "cover overflows": 59374, "coverage": 43934, "distributor delayed": 6400, "distributor undelayed": 6396, "distributor violated": 311, "exec candidate": 48399, "exec collide": 35713, "exec fuzz": 67923, "exec gen": 3569, "exec hints": 12772, "exec inject": 0, "exec minimize": 37333, "exec retries": 0, "exec seeds": 5987, "exec smash": 49380, "exec total [base]": 373870, "exec total [new]": 412625, "exec triage": 15145, "executor restarts": 741, "fault jobs": 0, "fuzzer jobs": 8, "fuzzing VMs [base]": 3, "fuzzing VMs [new]": 1, "hints jobs": 2, "max signal": 45637, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 21379, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 10, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5512, "no exec duration": 9566630000000, "no exec requests": 71255, "pending": 0, "prog exec time": 248, "reproducing": 2, "rpc recv": 9623105944, "rpc sent": 551805040, "signal": 42682, "smash jobs": 2, "triage jobs": 4, "vm output": 11558869, "vm restarts [base]": 11, "vm restarts [new]": 243 } 2025/08/15 20:46:02 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:46:08 runner 9 connected 2025/08/15 20:46:11 runner 7 connected 2025/08/15 20:46:38 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:46:52 runner 6 connected 2025/08/15 20:46:53 new: boot error: can't ssh into the instance 2025/08/15 20:46:54 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:47:15 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:47:42 runner 9 connected 2025/08/15 20:47:55 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:47:55 repro finished 'general protection fault in __io_queue_proc (full)', repro=true crepro=true desc='general protection fault in __io_queue_proc' hub=false from_dashboard=false 2025/08/15 20:47:55 found repro for "general protection fault in __io_queue_proc" (orig title: "-SAME-", reliability: 1), took 31.57 minutes 2025/08/15 20:47:55 "general protection fault in __io_queue_proc": saved crash log into 1755290875.crash.log 2025/08/15 20:47:55 "general protection fault in __io_queue_proc": saved repro log into 1755290875.repro.log 2025/08/15 20:48:03 runner 6 connected 2025/08/15 20:48:04 new: boot error: can't ssh into the instance 2025/08/15 20:48:39 base: boot error: can't ssh into the instance 2025/08/15 20:48:49 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 20:48:49 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 20:48:49 start reproducing 'general protection fault in __io_queue_proc' 2025/08/15 20:48:53 runner 8 connected 2025/08/15 20:48:53 runner 1 
connected 2025/08/15 20:49:38 runner 6 connected 2025/08/15 20:49:47 attempt #1 to run "general protection fault in __io_queue_proc" on base: did not crash 2025/08/15 20:49:55 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:50:04 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:50:44 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:50:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1535, "corpus": 4985, "corpus [files]": 1379, "corpus [symbols]": 1042, "cover overflows": 61110, "coverage": 44033, "distributor delayed": 6438, "distributor undelayed": 6437, "distributor violated": 312, "exec candidate": 48399, "exec collide": 38077, "exec fuzz": 72373, "exec gen": 3789, "exec hints": 12846, "exec inject": 0, "exec minimize": 37887, "exec retries": 0, "exec seeds": 6062, "exec smash": 50016, "exec total [base]": 382543, "exec total [new]": 421063, "exec triage": 15207, "executor restarts": 766, "fault jobs": 0, "fuzzer jobs": 2, "fuzzing VMs [base]": 2, "fuzzing VMs [new]": 2, "hints jobs": 0, "max signal": 45749, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 21685, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 10, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5538, "no exec duration": 9714969000000, "no exec requests": 71782, "pending": 0, "prog exec time": 161, "reproducing": 2, "rpc recv": 9898748832, "rpc sent": 575681136, "signal": 42780, "smash jobs": 0, "triage jobs": 2, "vm output": 11997755, "vm restarts [base]": 11, "vm restarts [new]": 251 } 2025/08/15 20:50:52 runner 8 connected 2025/08/15 20:50:57 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:51:00 runner 7 connected 2025/08/15 20:51:35 runner 9 connected 2025/08/15 20:51:42 attempt #0 to run "general protection fault in __io_queue_proc" on base: did not crash 2025/08/15 20:51:45 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:51:47 runner 6 connected 2025/08/15 20:51:56 new: boot error: can't ssh into the instance 2025/08/15 20:52:35 runner 8 connected 2025/08/15 20:53:18 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:53:19 runner 2 connected 2025/08/15 20:53:26 new: boot error: can't ssh into the instance 2025/08/15 20:53:33 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:53:43 attempt #2 to run "general protection fault in __io_queue_proc" on base: did not crash 2025/08/15 20:53:43 patched-only: general protection fault in __io_queue_proc 2025/08/15 20:54:07 runner 9 connected 2025/08/15 20:54:22 runner 1 connected 2025/08/15 20:54:41 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:55:15 new: boot error: can't ssh into the instance 2025/08/15 20:55:26 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:55:29 runner 8 connected 2025/08/15 20:55:42 attempt #1 to run "general protection fault in __io_queue_proc" on base: did not crash 2025/08/15 20:55:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1568, "corpus": 5022, "corpus [files]": 1386, "corpus [symbols]": 1048, "cover overflows": 63760, "coverage": 44750, 
"distributor delayed": 6510, "distributor undelayed": 6510, "distributor violated": 313, "exec candidate": 48399, "exec collide": 41804, "exec fuzz": 79348, "exec gen": 4141, "exec hints": 12881, "exec inject": 0, "exec minimize": 38845, "exec retries": 0, "exec seeds": 6186, "exec smash": 50993, "exec total [base]": 388935, "exec total [new]": 434373, "exec triage": 15362, "executor restarts": 791, "fault jobs": 0, "fuzzer jobs": 7, "fuzzing VMs [base]": 2, "fuzzing VMs [new]": 4, "hints jobs": 1, "max signal": 46485, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 22231, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 10, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5598, "no exec duration": 9714969000000, "no exec requests": 71782, "pending": 0, "prog exec time": 211, "reproducing": 2, "rpc recv": 10217880212, "rpc sent": 602915080, "signal": 43457, "smash jobs": 4, "triage jobs": 2, "vm output": 12505900, "vm restarts [base]": 12, "vm restarts [new]": 259 } 2025/08/15 20:55:58 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:56:03 runner 4 connected 2025/08/15 20:56:08 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:56:18 runner 9 connected 2025/08/15 20:56:46 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:56:46 runner 6 connected 2025/08/15 20:56:56 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:56:57 runner 1 connected 2025/08/15 20:57:04 new: boot error: can't ssh into the instance 2025/08/15 20:57:16 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:57:34 attempt #2 to run "general protection fault in __io_queue_proc" on base: did not crash 2025/08/15 20:57:34 patched-only: general protection fault in __io_queue_proc 2025/08/15 20:57:46 runner 8 connected 2025/08/15 20:58:03 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:58:05 runner 9 connected 2025/08/15 20:58:08 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:58:21 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 20:58:23 runner 1 connected 2025/08/15 20:58:45 base: boot error: can't ssh into the instance 2025/08/15 20:58:52 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:58:56 runner 6 connected 2025/08/15 20:58:59 runner 1 connected 2025/08/15 20:59:03 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 20:59:35 runner 0 connected 2025/08/15 20:59:41 runner 4 connected 2025/08/15 21:00:41 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:00:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1604, "corpus": 5038, "corpus [files]": 1394, "corpus [symbols]": 1056, "cover overflows": 66059, "coverage": 44784, "distributor delayed": 6532, "distributor undelayed": 6532, "distributor violated": 315, "exec candidate": 48399, "exec collide": 45129, "exec fuzz": 85317, "exec gen": 4457, "exec hints": 12896, "exec 
inject": 0, "exec minimize": 39258, "exec retries": 0, "exec seeds": 6236, "exec smash": 51384, "exec total [base]": 398763, "exec total [new]": 444893, "exec triage": 15405, "executor restarts": 820, "fault jobs": 0, "fuzzer jobs": 4, "fuzzing VMs [base]": 4, "fuzzing VMs [new]": 3, "hints jobs": 0, "max signal": 46512, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 22452, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 10, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5617, "no exec duration": 9714969000000, "no exec requests": 71782, "pending": 0, "prog exec time": 261, "reproducing": 2, "rpc recv": 10585878872, "rpc sent": 631558088, "signal": 43484, "smash jobs": 3, "triage jobs": 1, "vm output": 12909000, "vm restarts [base]": 14, "vm restarts [new]": 268 } 2025/08/15 21:01:28 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:01:38 runner 1 connected 2025/08/15 21:01:53 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:01:58 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:02:02 new: boot error: can't ssh into the instance 2025/08/15 21:02:04 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:02:42 runner 4 connected 2025/08/15 21:02:54 runner 9 connected 2025/08/15 21:02:58 runner 5 connected 2025/08/15 21:03:20 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:03:24 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:03:32 new: boot error: can't ssh into the instance 2025/08/15 21:03:39 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:04:01 runner 4 connected 2025/08/15 21:04:12 runner 1 connected 2025/08/15 21:04:20 runner 5 connected 2025/08/15 21:05:21 reproducing crash 'WARNING in io_ring_exit_work': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f kernel/time/sleep_timeout.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:05:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1610, "corpus": 5051, "corpus [files]": 1398, "corpus [symbols]": 1060, "cover overflows": 67925, "coverage": 44888, "distributor delayed": 6578, "distributor undelayed": 6578, "distributor violated": 316, "exec candidate": 48399, "exec collide": 47856, "exec fuzz": 90506, "exec gen": 4699, "exec hints": 12911, "exec inject": 0, "exec minimize": 39568, "exec retries": 0, "exec seeds": 6275, "exec smash": 51748, "exec total [base]": 411042, "exec total [new]": 453846, "exec triage": 15469, "executor restarts": 842, "fault jobs": 0, "fuzzer jobs": 2, "fuzzing VMs [base]": 4, "fuzzing VMs [new]": 4, "hints jobs": 0, "max signal": 46636, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 22633, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 10, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5641, "no exec duration": 10225617000000, "no exec requests": 73881, "pending": 0, "prog exec 
time": 290, "reproducing": 2, "rpc recv": 10821454884, "rpc sent": 659727648, "signal": 43588, "smash jobs": 0, "triage jobs": 2, "vm output": 13359681, "vm restarts [base]": 14, "vm restarts [new]": 275 } 2025/08/15 21:06:36 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:06:45 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:06:51 new: boot error: can't ssh into the instance 2025/08/15 21:07:05 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:07:26 runner 1 connected 2025/08/15 21:07:41 runner 7 connected 2025/08/15 21:07:54 runner 9 connected 2025/08/15 21:08:26 new: boot error: can't ssh into the instance 2025/08/15 21:08:58 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:09:08 new: boot error: can't ssh into the instance 2025/08/15 21:09:52 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:09:54 runner 9 connected 2025/08/15 21:10:05 runner 8 connected 2025/08/15 21:10:30 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:10:36 reproducing crash 'WARNING in io_ring_exit_work': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f kernel/time/sleep_timeout.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:10:37 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:10:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1634, "corpus": 5079, "corpus [files]": 1415, "corpus [symbols]": 1076, "cover overflows": 70488, "coverage": 44948, "distributor delayed": 6624, "distributor undelayed": 6624, "distributor violated": 317, "exec candidate": 48399, "exec collide": 51565, "exec fuzz": 97582, "exec gen": 5086, "exec hints": 12971, "exec inject": 0, "exec minimize": 40325, "exec retries": 0, "exec seeds": 6359, "exec smash": 52424, "exec total [base]": 423596, "exec total [new]": 466692, "exec triage": 15568, "executor restarts": 860, "fault jobs": 0, "fuzzer jobs": 0, "fuzzing VMs [base]": 4, "fuzzing VMs [new]": 3, "hints jobs": 0, "max signal": 46726, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 23079, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 10, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5682, "no exec duration": 10561542000000, "no exec requests": 75293, "pending": 0, "prog exec time": 297, "reproducing": 2, "rpc recv": 11013130812, "rpc sent": 693269152, "signal": 43645, "smash jobs": 0, "triage jobs": 0, "vm output": 14143632, "vm restarts [base]": 14, "vm restarts [new]": 280 } 2025/08/15 21:10:53 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:11:19 runner 4 connected 2025/08/15 21:11:35 runner 8 connected 2025/08/15 21:11:51 runner 9 connected 2025/08/15 21:12:10 new: boot error: can't ssh into the instance 
2025/08/15 21:13:06 runner 6 connected 2025/08/15 21:13:37 new: boot error: can't ssh into the instance 2025/08/15 21:14:38 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:14:54 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:15:05 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:15:08 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:15:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1671, "corpus": 5112, "corpus [files]": 1430, "corpus [symbols]": 1090, "cover overflows": 73867, "coverage": 45047, "distributor delayed": 6669, "distributor undelayed": 6669, "distributor violated": 318, "exec candidate": 48399, "exec collide": 56517, "exec fuzz": 106686, "exec gen": 5554, "exec hints": 13020, "exec inject": 0, "exec minimize": 41126, "exec retries": 0, "exec seeds": 6458, "exec smash": 53237, "exec total [base]": 435332, "exec total [new]": 483112, "exec triage": 15700, "executor restarts": 875, "fault jobs": 0, "fuzzer jobs": 4, "fuzzing VMs [base]": 4, "fuzzing VMs [new]": 5, "hints jobs": 0, "max signal": 46885, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 23509, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 10, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5736, "no exec duration": 10704113000000, "no exec requests": 76006, "pending": 0, "prog exec time": 281, "reproducing": 2, "rpc recv": 11180304868, "rpc sent": 732488872, "signal": 43738, "smash jobs": 1, "triage jobs": 3, "vm output": 14763579, "vm restarts [base]": 14, "vm restarts [new]": 284 } 2025/08/15 21:15:51 runner 9 connected 2025/08/15 21:16:01 runner 6 connected 2025/08/15 21:16:07 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:16:43 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:17:42 runner 1 connected 2025/08/15 21:17:56 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:18:04 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:18:54 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:18:55 runner 9 connected 2025/08/15 21:19:01 runner 4 connected 2025/08/15 21:19:36 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:19:43 runner 1 connected 2025/08/15 21:19:55 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:20:02 patched crashed: general 
protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:20:11 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:20:17 base crash: WARNING in io_ring_exit_work 2025/08/15 21:20:21 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:20:33 runner 5 connected 2025/08/15 21:20:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1710, "corpus": 5183, "corpus [files]": 1449, "corpus [symbols]": 1109, "cover overflows": 77410, "coverage": 45407, "distributor delayed": 6736, "distributor undelayed": 6736, "distributor violated": 320, "exec candidate": 48399, "exec collide": 61146, "exec fuzz": 115563, "exec gen": 5988, "exec hints": 13101, "exec inject": 0, "exec minimize": 42893, "exec retries": 0, "exec seeds": 6673, "exec smash": 54971, "exec total [base]": 448699, "exec total [new]": 501027, "exec triage": 15881, "executor restarts": 899, "fault jobs": 0, "fuzzer jobs": 9, "fuzzing VMs [base]": 3, "fuzzing VMs [new]": 4, "hints jobs": 1, "max signal": 47263, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 24506, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 10, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5819, "no exec duration": 10704113000000, "no exec requests": 76006, "pending": 0, "prog exec time": 194, "reproducing": 2, "rpc recv": 11465123564, "rpc sent": 778060344, "signal": 44088, "smash jobs": 4, "triage jobs": 4, "vm output": 15390924, "vm restarts [base]": 14, "vm restarts [new]": 291 } 2025/08/15 21:21:00 runner 6 connected 2025/08/15 21:21:01 runner 7 connected 2025/08/15 21:21:06 runner 0 connected 2025/08/15 21:21:10 runner 8 connected 2025/08/15 21:21:20 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:21:24 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:22:21 runner 7 connected 2025/08/15 21:22:53 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:23:27 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:23:30 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:23:42 runner 9 connected 2025/08/15 21:24:03 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:24:03 repro finished 'general protection fault in __io_queue_proc', repro=true crepro=false desc='general protection fault in __io_queue_proc' hub=false from_dashboard=false 2025/08/15 21:24:03 found repro for "general protection fault in __io_queue_proc" (orig title: "-SAME-", reliability: 1), took 35.24 minutes 2025/08/15 21:24:03 "general protection fault in __io_queue_proc": saved crash log into 1755293043.crash.log 2025/08/15 21:24:03 "general protection fault in __io_queue_proc": saved repro log into 1755293043.repro.log 2025/08/15 21:24:24 runner 8 connected 2025/08/15 21:24:26 runner 4 connected 2025/08/15 21:25:45 STAT { "buffer too small": 0, 
"candidate triage jobs": 0, "candidates": 0, "comps overflows": 1737, "corpus": 5209, "corpus [files]": 1456, "corpus [symbols]": 1116, "cover overflows": 81435, "coverage": 45730, "distributor delayed": 6763, "distributor undelayed": 6763, "distributor violated": 320, "exec candidate": 48399, "exec collide": 67075, "exec fuzz": 126660, "exec gen": 6605, "exec hints": 13133, "exec inject": 0, "exec minimize": 43570, "exec retries": 0, "exec seeds": 6752, "exec smash": 55634, "exec total [base]": 461246, "exec total [new]": 520216, "exec triage": 15964, "executor restarts": 922, "fault jobs": 0, "fuzzer jobs": 5, "fuzzing VMs [base]": 3, "fuzzing VMs [new]": 7, "hints jobs": 1, "max signal": 47586, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 24879, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 10, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5853, "no exec duration": 10704113000000, "no exec requests": 76006, "pending": 0, "prog exec time": 261, "reproducing": 1, "rpc recv": 11750721412, "rpc sent": 822777576, "signal": 44409, "smash jobs": 4, "triage jobs": 0, "vm output": 16000894, "vm restarts [base]": 15, "vm restarts [new]": 298 } 2025/08/15 21:26:04 attempt #0 to run "general protection fault in __io_queue_proc" on base: did not crash 2025/08/15 21:26:13 new: boot error: can't ssh into the instance 2025/08/15 21:26:24 patched crashed: general protection fault in __io_queue_proc [need repro = true] 2025/08/15 21:26:24 scheduled a reproduction of 'general protection fault in __io_queue_proc' 2025/08/15 21:26:24 start reproducing 'general protection fault in __io_queue_proc' 2025/08/15 21:26:50 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:26:57 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:27:07 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:27:14 runner 1 connected 2025/08/15 21:27:29 base crash: WARNING in io_ring_exit_work 2025/08/15 21:27:40 runner 7 connected 2025/08/15 21:27:41 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:27:43 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:27:46 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:27:53 runner 5 connected 2025/08/15 21:28:02 reproducing crash 'WARNING in io_ring_exit_work': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f kernel/time/sleep_timeout.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:28:04 attempt #1 to run "general protection fault in __io_queue_proc" on base: did not crash 2025/08/15 21:28:18 runner 1 connected 2025/08/15 21:28:36 runner 4 connected 2025/08/15 21:28:38 runner 1 connected 2025/08/15 21:28:39 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:29:02 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no 
such file or directory 2025/08/15 21:29:04 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:29:14 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:29:14 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:29:36 runner 6 connected 2025/08/15 21:29:53 runner 5 connected 2025/08/15 21:30:02 attempt #2 to run "general protection fault in __io_queue_proc" on base: did not crash 2025/08/15 21:30:02 patched-only: general protection fault in __io_queue_proc 2025/08/15 21:30:02 scheduled a reproduction of 'general protection fault in __io_queue_proc (full)' 2025/08/15 21:30:02 start reproducing 'general protection fault in __io_queue_proc (full)' 2025/08/15 21:30:04 runner 8 connected 2025/08/15 21:30:35 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:30:42 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:30:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1766, "corpus": 5232, "corpus [files]": 1470, "corpus [symbols]": 1130, "cover overflows": 84125, "coverage": 45774, "distributor delayed": 6798, "distributor undelayed": 6796, "distributor violated": 320, "exec candidate": 48399, "exec collide": 71101, "exec fuzz": 134354, "exec gen": 7017, "exec hints": 13212, "exec inject": 0, "exec minimize": 44057, "exec retries": 0, "exec seeds": 6821, "exec smash": 56211, "exec total [base]": 470653, "exec total [new]": 533623, "exec triage": 16036, "executor restarts": 946, "fault jobs": 0, "fuzzer jobs": 6, "fuzzing VMs [base]": 2, "fuzzing VMs [new]": 3, "hints jobs": 0, "max signal": 47640, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 25132, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 10, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5884, "no exec duration": 10704438000000, "no exec requests": 76007, "pending": 0, "prog exec time": 271, "reproducing": 3, "rpc recv": 12060759348, "rpc sent": 856822032, "signal": 44451, "smash jobs": 3, "triage jobs": 3, "vm output": 16681581, "vm restarts [base]": 16, "vm restarts [new]": 306 } 2025/08/15 21:30:46 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:30:46 runner 3 connected 2025/08/15 21:30:51 runner 0 connected 2025/08/15 21:31:23 runner 7 connected 2025/08/15 21:31:26 new: boot error: can't ssh into the instance 2025/08/15 21:31:43 runner 4 connected 2025/08/15 21:31:55 reproducing crash 'WARNING in io_ring_exit_work': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f kernel/time/sleep_timeout.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:32:05 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:32:10 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:32:10 reproducing crash 'no output/lost 
connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:33:06 runner 8 connected 2025/08/15 21:33:18 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:33:25 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:33:39 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:33:50 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:33:59 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:34:01 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:34:07 runner 4 connected 2025/08/15 21:34:12 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:34:29 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:34:47 runner 5 connected 2025/08/15 21:34:49 runner 8 connected 2025/08/15 21:35:01 runner 7 connected 2025/08/15 21:35:01 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:35:08 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:35:18 reproducing crash 'WARNING in io_ring_exit_work': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f kernel/time/sleep_timeout.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:35:18 repro finished 'WARNING in io_ring_exit_work', repro=true crepro=false desc='INFO: task hung in io_wq_put_and_exit' hub=false from_dashboard=false 2025/08/15 21:35:18 found repro for "INFO: task hung in io_wq_put_and_exit" (orig title: "WARNING in io_ring_exit_work", reliability: 1), took 117.38 minutes 2025/08/15 21:35:18 "INFO: task hung in io_wq_put_and_exit": saved crash log into 1755293718.crash.log 2025/08/15 21:35:18 "INFO: task hung in io_wq_put_and_exit": saved repro log into 1755293718.repro.log 2025/08/15 21:35:18 runner 4 connected 2025/08/15 21:35:27 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:35:43 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:35:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1778, "corpus": 5263, "corpus [files]": 1475, "corpus [symbols]": 1135, "cover overflows": 85943, "coverage": 45877, "distributor delayed": 6839, "distributor undelayed": 6839, "distributor violated": 322, "exec 
candidate": 48399, "exec collide": 73604, "exec fuzz": 139081, "exec gen": 7242, "exec hints": 13235, "exec inject": 0, "exec minimize": 44904, "exec retries": 0, "exec seeds": 6914, "exec smash": 57033, "exec total [base]": 483594, "exec total [new]": 542945, "exec triage": 16122, "executor restarts": 966, "fault jobs": 0, "fuzzer jobs": 2, "fuzzing VMs [base]": 3, "fuzzing VMs [new]": 2, "hints jobs": 0, "max signal": 47746, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 25534, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 10, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5921, "no exec duration": 10875299000000, "no exec requests": 76739, "pending": 0, "prog exec time": 231, "reproducing": 2, "rpc recv": 12407396712, "rpc sent": 889197720, "signal": 44550, "smash jobs": 0, "triage jobs": 2, "vm output": 17348876, "vm restarts [base]": 18, "vm restarts [new]": 314 } 2025/08/15 21:36:07 runner 0 connected 2025/08/15 21:36:20 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:36:22 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:36:27 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:36:32 runner 8 connected 2025/08/15 21:36:49 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:37:13 new: boot error: can't ssh into the instance 2025/08/15 21:37:25 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:37:50 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:38:02 runner 9 connected 2025/08/15 21:38:13 runner 8 connected 2025/08/15 21:38:15 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:38:33 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:39:20 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:39:22 runner 9 connected 2025/08/15 21:39:47 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:40:30 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:40:41 patched crashed: general protection fault in __io_queue_proc 
[need repro = false] 2025/08/15 21:40:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1805, "corpus": 5279, "corpus [files]": 1482, "corpus [symbols]": 1142, "cover overflows": 87584, "coverage": 45909, "distributor delayed": 6873, "distributor undelayed": 6871, "distributor violated": 328, "exec candidate": 48399, "exec collide": 75571, "exec fuzz": 142706, "exec gen": 7484, "exec hints": 13350, "exec inject": 0, "exec minimize": 45368, "exec retries": 0, "exec seeds": 6968, "exec smash": 57483, "exec total [base]": 490557, "exec total [new]": 549910, "exec triage": 16172, "executor restarts": 982, "fault jobs": 0, "fuzzer jobs": 2, "fuzzing VMs [base]": 3, "fuzzing VMs [new]": 0, "hints jobs": 0, "max signal": 47779, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 25761, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 12, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5943, "no exec duration": 11665410000000, "no exec requests": 79952, "pending": 0, "prog exec time": 236, "reproducing": 2, "rpc recv": 12588340700, "rpc sent": 910112904, "signal": 44581, "smash jobs": 0, "triage jobs": 2, "vm output": 17867753, "vm restarts [base]": 18, "vm restarts [new]": 319 } 2025/08/15 21:40:51 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:41:12 runner 9 connected 2025/08/15 21:41:22 runner 8 connected 2025/08/15 21:41:31 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:41:41 runner 0 connected 2025/08/15 21:42:07 base crash: kernel BUG in filemap_fault 2025/08/15 21:42:21 runner 9 connected 2025/08/15 21:43:03 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:43:03 runner 3 connected 2025/08/15 21:43:16 patched crashed: kernel BUG in filemap_fault [need repro = false] 2025/08/15 21:43:37 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:43:44 new: boot error: can't ssh into the instance 2025/08/15 21:44:07 runner 0 connected 2025/08/15 21:44:17 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:44:25 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:44:27 runner 9 connected 2025/08/15 21:44:51 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:45:13 new: boot error: can't ssh into the instance 2025/08/15 21:45:13 runner 8 connected 2025/08/15 21:45:22 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:45:24 base: boot error: can't ssh into the instance 2025/08/15 21:45:40 runner 9 connected 2025/08/15 21:45:45 STAT { "buffer too small": 0, "candidate triage jobs": 0, "candidates": 0, "comps overflows": 1808, "corpus": 5282, "corpus [files]": 1483, "corpus [symbols]": 1143, "cover 
overflows": 88565, "coverage": 45912, "distributor delayed": 6894, "distributor undelayed": 6891, "distributor violated": 329, "exec candidate": 48399, "exec collide": 77209, "exec fuzz": 145727, "exec gen": 7633, "exec hints": 13350, "exec inject": 0, "exec minimize": 45475, "exec retries": 0, "exec seeds": 6977, "exec smash": 57558, "exec total [base]": 495600, "exec total [new]": 554946, "exec triage": 16206, "executor restarts": 1004, "fault jobs": 0, "fuzzer jobs": 4, "fuzzing VMs [base]": 3, "fuzzing VMs [new]": 1, "hints jobs": 0, "max signal": 47839, "minimize: array": 0, "minimize: buffer": 0, "minimize: call": 25838, "minimize: filename": 0, "minimize: integer": 0, "minimize: pointer": 15, "minimize: props": 0, "minimize: resource": 0, "modules [base]": 1, "modules [new]": 1, "new inputs": 5956, "no exec duration": 12225275000000, "no exec requests": 81885, "pending": 0, "prog exec time": 190, "reproducing": 2, "rpc recv": 12844557020, "rpc sent": 926630768, "signal": 44584, "smash jobs": 0, "triage jobs": 4, "vm output": 18424798, "vm restarts [base]": 19, "vm restarts [new]": 327 } 2025/08/15 21:45:48 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:46:02 runner 5 connected 2025/08/15 21:46:26 new: boot error: can't ssh into the instance 2025/08/15 21:46:33 new: boot error: can't ssh into the instance 2025/08/15 21:46:58 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:47:12 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:47:22 runner 7 connected 2025/08/15 21:47:23 runner 4 connected 2025/08/15 21:47:46 runner 8 connected 2025/08/15 21:48:01 runner 9 connected 2025/08/15 21:48:09 patched crashed: general protection fault in __io_queue_proc [need repro = false] 2025/08/15 21:49:05 runner 4 connected 2025/08/15 21:49:17 attempt #0 to run "INFO: task hung in io_wq_put_and_exit" on base: crashed with INFO: task hung in io_wq_put_and_exit 2025/08/15 21:49:17 crashes both: INFO: task hung in io_wq_put_and_exit / INFO: task hung in io_wq_put_and_exit 2025/08/15 21:49:31 reproducing crash 'no output/lost connection': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:50:14 runner 0 connected 2025/08/15 21:50:33 base crash: WARNING in io_ring_exit_work 2025/08/15 21:50:42 status reporting terminated 2025/08/15 21:50:42 bug reporting terminated 2025/08/15 21:50:42 repro finished 'general protection fault in __io_queue_proc (full)', repro=false crepro=false desc='' hub=false from_dashboard=false 2025/08/15 21:50:58 reproducing crash 'general protection fault in __io_queue_proc': failed to symbolize report: failed to start scripts/get_maintainer.pl [scripts/get_maintainer.pl --git-min-percent=15 -f io_uring/poll.c]: fork/exec scripts/get_maintainer.pl: no such file or directory 2025/08/15 21:50:58 repro finished 'general protection fault in __io_queue_proc', repro=false crepro=false desc='' hub=false from_dashboard=false 2025/08/15 21:51:14 syz-diff (base): kernel context loop terminated 2025/08/15 21:55:54 syz-diff (new): kernel context loop terminated 2025/08/15 21:55:54 diff fuzzing terminated 2025/08/15 21:55:54 fuzzing is finished 2025/08/15 21:55:54 status at the end:
Title                                          On-Base      On-Patched
general protection fault in __io_queue_proc                 322 crashes [reproduced]
INFO: task hung in io_wq_put_and_exit                       1 crashes [reproduced]
WARNING in io_ring_exit_work                   6 crashes    2 crashes
kernel BUG in filemap_fault                    1 crashes    1 crashes
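The run thus ends with the general protection fault in __io_queue_proc classified as patched-only: the reproducer found at 21:24:03 was replayed on the base kernel three times (attempts #0-#2, each "did not crash") before the verdict at 21:30:02, whereas the io_wq_put_and_exit hang reproduced on both kernels ("crashes both" at 21:49:17). Below is a minimal sketch of that classification loop; the names classify and runOnBase are hypothetical stand-ins for illustration, not syzkaller's real API.

```go
// Minimal sketch of the patched-only verdict visible in the log above:
// a reproducer found on the patched kernel is replayed on the base kernel
// a few times; if base never crashes, the crash is attributed to the patch.
package main

import "fmt"

// runOnBase stands in for booting a base-kernel VM and executing the repro;
// it reports whether the base kernel crashed and with which title.
type runOnBase func(repro string) (crashed bool, title string)

// classify mirrors the "attempt #N to run ... on base" loop in the log:
// any base crash yields "crashes both"; otherwise the bug is patched-only.
func classify(repro, patchedTitle string, attempts int, run runOnBase) string {
	for i := 0; i < attempts; i++ {
		crashed, baseTitle := run(repro)
		fmt.Printf("attempt #%d to run %q on base: crashed=%v\n", i, patchedTitle, crashed)
		if crashed {
			return fmt.Sprintf("crashes both: %s / %s", baseTitle, patchedTitle)
		}
	}
	return "patched-only: " + patchedTitle
}

func main() {
	neverCrashes := func(string) (bool, string) { return false, "" }
	fmt.Println(classify("repro.syz", "general protection fault in __io_queue_proc", 3, neverCrashes))
}
```

With three non-crashing base attempts this prints the same "patched-only" verdict seen in the log; a single base crash would instead report "crashes both", as happened for the io_wq_put_and_exit hang.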